1. Table of Contents



1.1 Sample Data


The Wisconsin Breast Cancer dataset, shared on the Kaggle website and originally obtained from the UCI Machine Learning Repository, was used for this illustrative example.

Preliminary dataset assessment:

[A] 1138 rows (observations)

[B] 32 columns (variables)
     [B.1] 1/32 metadata (unique identifiers) = id (numeric)
     [B.2] 1/32 response = diagnosis (factor)
     [B.3] 30/32 predictors = 30/30 numeric
            [B.3.1] radius_mean (numeric)
            [B.3.2] texture_mean (numeric)
            [B.3.3] perimeter_mean (numeric)
            [B.3.4] area_mean (numeric)
            [B.3.5] smoothness_mean (numeric)
            [B.3.6] compactness_mean (numeric)
            [B.3.7] concavity_mean (numeric)
            [B.3.8] concave.points_mean (numeric)
            [B.3.9] symmetry_mean (numeric)
            [B.3.10] fractal_dimension_mean (numeric)
            [B.3.11] radius_se (numeric)
            [B.3.12] texture_se (numeric)
            [B.3.13] perimeter_se (numeric)
            [B.3.14] area_se (numeric)
            [B.3.15] smoothness_se (numeric)
            [B.3.16] compactness_se (numeric)
            [B.3.17] concavity_se (numeric)
            [B.3.18] concave.points_se (numeric)
            [B.3.19] symmetry_se (numeric)
            [B.3.20] fractal_dimension_se (numeric)
            [B.3.21] radius_worst (numeric)
            [B.3.22] texture_worst (numeric)
            [B.3.23] perimeter_worst (numeric)
            [B.3.24] area_worst (numeric)
            [B.3.25] smoothness_worst (numeric)
            [B.3.26] compactness_worst (numeric)
            [B.3.27] concavity_worst (numeric)
            [B.3.28] concave.points_worst (numeric)
            [B.3.29] symmetry_worst (numeric)
            [B.3.30] fractal_dimension_worst (numeric)

Code Chunk | Output
##################################
# Loading R libraries
##################################
library(AppliedPredictiveModeling)
library(tidyr)
library(caret)
library(lattice)
library(dplyr)
library(moments)
library(skimr)
library(RANN)
library(pls)
library(corrplot)
library(lares)
library(DMwR)
library(gridExtra)
library(rattle)
library(RColorBrewer)
library(stats)
library(caretEnsemble)
library(pROC)
library(adabag)
library(gbm)
library(xgboost)

##################################
# Loading source and
# formulating the analysis set
##################################
BreastCancer <- read.csv("WisconsinBreastCancer.csv",
                   na.strings=c("NA","NaN"," ",""),
                   stringsAsFactors = FALSE)
BreastCancer <- as.data.frame(BreastCancer)

##################################
# Performing a general exploration of the data set
##################################
dim(BreastCancer)
## [1] 1138   32
str(BreastCancer)
## 'data.frame':    1138 obs. of  32 variables:
##  $ id                     : int  842302 842517 84300903 84348301 84358402 843786 844359 84458202 844981 84501001 ...
##  $ diagnosis              : chr  "M" "M" "M" "M" ...
##  $ radius_mean            : num  18 20.6 19.7 11.4 20.3 ...
##  $ texture_mean           : num  10.4 17.8 21.2 20.4 14.3 ...
##  $ perimeter_mean         : num  122.8 132.9 130 77.6 135.1 ...
##  $ area_mean              : num  1001 1326 1203 386 1297 ...
##  $ smoothness_mean        : num  0.1184 0.0847 0.1096 0.1425 0.1003 ...
##  $ compactness_mean       : num  0.2776 0.0786 0.1599 0.2839 0.1328 ...
##  $ concavity_mean         : num  0.3001 0.0869 0.1974 0.2414 0.198 ...
##  $ concave.points_mean    : num  0.1471 0.0702 0.1279 0.1052 0.1043 ...
##  $ symmetry_mean          : num  0.242 0.181 0.207 0.26 0.181 ...
##  $ fractal_dimension_mean : num  0.0787 0.0567 0.06 0.0974 0.0588 ...
##  $ radius_se              : num  1.095 0.543 0.746 0.496 0.757 ...
##  $ texture_se             : num  0.905 0.734 0.787 1.156 0.781 ...
##  $ perimeter_se           : num  8.59 3.4 4.58 3.44 5.44 ...
##  $ area_se                : num  153.4 74.1 94 27.2 94.4 ...
##  $ smoothness_se          : num  0.0064 0.00522 0.00615 0.00911 0.01149 ...
##  $ compactness_se         : num  0.049 0.0131 0.0401 0.0746 0.0246 ...
##  $ concavity_se           : num  0.0537 0.0186 0.0383 0.0566 0.0569 ...
##  $ concave.points_se      : num  0.0159 0.0134 0.0206 0.0187 0.0188 ...
##  $ symmetry_se            : num  0.03 0.0139 0.0225 0.0596 0.0176 ...
##  $ fractal_dimension_se   : num  0.00619 0.00353 0.00457 0.00921 0.00511 ...
##  $ radius_worst           : num  25.4 25 23.6 14.9 22.5 ...
##  $ texture_worst          : num  17.3 23.4 25.5 26.5 16.7 ...
##  $ perimeter_worst        : num  184.6 158.8 152.5 98.9 152.2 ...
##  $ area_worst             : num  2019 1956 1709 568 1575 ...
##  $ smoothness_worst       : num  0.162 0.124 0.144 0.21 0.137 ...
##  $ compactness_worst      : num  0.666 0.187 0.424 0.866 0.205 ...
##  $ concavity_worst        : num  0.712 0.242 0.45 0.687 0.4 ...
##  $ concave.points_worst   : num  0.265 0.186 0.243 0.258 0.163 ...
##  $ symmetry_worst         : num  0.46 0.275 0.361 0.664 0.236 ...
##  $ fractal_dimension_worst: num  0.1189 0.089 0.0876 0.173 0.0768 ...
summary(BreastCancer)
##        id             diagnosis          radius_mean      texture_mean  
##  Min.   :     8670   Length:1138        Min.   : 6.981   Min.   : 9.71  
##  1st Qu.:   869218   Class :character   1st Qu.:11.700   1st Qu.:16.17  
##  Median :   906024   Mode  :character   Median :13.370   Median :18.84  
##  Mean   : 30371831                      Mean   :14.127   Mean   :19.29  
##  3rd Qu.:  8813129                      3rd Qu.:15.780   3rd Qu.:21.80  
##  Max.   :911320502                      Max.   :28.110   Max.   :39.28  
##  perimeter_mean     area_mean      smoothness_mean   compactness_mean 
##  Min.   : 43.79   Min.   : 143.5   Min.   :0.05263   Min.   :0.01938  
##  1st Qu.: 75.17   1st Qu.: 420.3   1st Qu.:0.08637   1st Qu.:0.06492  
##  Median : 86.24   Median : 551.1   Median :0.09587   Median :0.09263  
##  Mean   : 91.97   Mean   : 654.9   Mean   :0.09636   Mean   :0.10434  
##  3rd Qu.:104.10   3rd Qu.: 782.7   3rd Qu.:0.10530   3rd Qu.:0.13040  
##  Max.   :188.50   Max.   :2501.0   Max.   :0.16340   Max.   :0.34540  
##  concavity_mean    concave.points_mean symmetry_mean    fractal_dimension_mean
##  Min.   :0.00000   Min.   :0.00000     Min.   :0.1060   Min.   :0.04996       
##  1st Qu.:0.02956   1st Qu.:0.02031     1st Qu.:0.1619   1st Qu.:0.05770       
##  Median :0.06154   Median :0.03350     Median :0.1792   Median :0.06154       
##  Mean   :0.08880   Mean   :0.04892     Mean   :0.1812   Mean   :0.06280       
##  3rd Qu.:0.13070   3rd Qu.:0.07400     3rd Qu.:0.1957   3rd Qu.:0.06612       
##  Max.   :0.42680   Max.   :0.20120     Max.   :0.3040   Max.   :0.09744       
##    radius_se        texture_se      perimeter_se       area_se       
##  Min.   :0.1115   Min.   :0.3602   Min.   : 0.757   Min.   :  6.802  
##  1st Qu.:0.2324   1st Qu.:0.8339   1st Qu.: 1.606   1st Qu.: 17.850  
##  Median :0.3242   Median :1.1080   Median : 2.287   Median : 24.530  
##  Mean   :0.4052   Mean   :1.2169   Mean   : 2.866   Mean   : 40.337  
##  3rd Qu.:0.4789   3rd Qu.:1.4740   3rd Qu.: 3.357   3rd Qu.: 45.190  
##  Max.   :2.8730   Max.   :4.8850   Max.   :21.980   Max.   :542.200  
##  smoothness_se      compactness_se      concavity_se     concave.points_se 
##  Min.   :0.001713   Min.   :0.002252   Min.   :0.00000   Min.   :0.000000  
##  1st Qu.:0.005169   1st Qu.:0.013080   1st Qu.:0.01509   1st Qu.:0.007638  
##  Median :0.006380   Median :0.020450   Median :0.02589   Median :0.010930  
##  Mean   :0.007041   Mean   :0.025478   Mean   :0.03189   Mean   :0.011796  
##  3rd Qu.:0.008146   3rd Qu.:0.032450   3rd Qu.:0.04205   3rd Qu.:0.014710  
##  Max.   :0.031130   Max.   :0.135400   Max.   :0.39600   Max.   :0.052790  
##   symmetry_se       fractal_dimension_se  radius_worst   texture_worst  
##  Min.   :0.007882   Min.   :0.0008948    Min.   : 7.93   Min.   :12.02  
##  1st Qu.:0.015160   1st Qu.:0.0022480    1st Qu.:13.01   1st Qu.:21.08  
##  Median :0.018730   Median :0.0031870    Median :14.97   Median :25.41  
##  Mean   :0.020542   Mean   :0.0037949    Mean   :16.27   Mean   :25.68  
##  3rd Qu.:0.023480   3rd Qu.:0.0045580    3rd Qu.:18.79   3rd Qu.:29.72  
##  Max.   :0.078950   Max.   :0.0298400    Max.   :36.04   Max.   :49.54  
##  perimeter_worst    area_worst     smoothness_worst  compactness_worst
##  Min.   : 50.41   Min.   : 185.2   Min.   :0.07117   Min.   :0.02729  
##  1st Qu.: 84.11   1st Qu.: 515.3   1st Qu.:0.11660   1st Qu.:0.14720  
##  Median : 97.66   Median : 686.5   Median :0.13130   Median :0.21190  
##  Mean   :107.26   Mean   : 880.6   Mean   :0.13237   Mean   :0.25427  
##  3rd Qu.:125.40   3rd Qu.:1084.0   3rd Qu.:0.14600   3rd Qu.:0.33910  
##  Max.   :251.20   Max.   :4254.0   Max.   :0.22260   Max.   :1.05800  
##  concavity_worst  concave.points_worst symmetry_worst   fractal_dimension_worst
##  Min.   :0.0000   Min.   :0.00000      Min.   :0.1565   Min.   :0.05504        
##  1st Qu.:0.1145   1st Qu.:0.06493      1st Qu.:0.2504   1st Qu.:0.07146        
##  Median :0.2267   Median :0.09993      Median :0.2822   Median :0.08004        
##  Mean   :0.2722   Mean   :0.11461      Mean   :0.2901   Mean   :0.08395        
##  3rd Qu.:0.3829   3rd Qu.:0.16140      3rd Qu.:0.3179   3rd Qu.:0.09208        
##  Max.   :1.2520   Max.   :0.29100      Max.   :0.6638   Max.   :0.20750
##################################
# Setting the data type
# for the response variable
##################################
BreastCancer$diagnosis <- factor(BreastCancer$diagnosis,
                                 levels = c("M","B"))
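Listing M before B is deliberate: caret's confusionMatrix() and twoClassSummary() treat the first factor level as the positive (event) class, so placing M first makes malignancy the event of interest. A minimal illustration on a toy vector (not taken from the analysis set):

```r
# caret treats the first factor level as the positive / event class,
# so listing "M" first makes malignancy the event of interest.
diagnosis <- factor(c("B", "M", "B", "M", "M"), levels = c("M", "B"))
levels(diagnosis)   # "M" "B" - M is the first (positive) level
table(diagnosis)    # tabulated in level order: M first
```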

##################################
# Formulating a data type assessment summary
##################################
PDA <- BreastCancer
(PDA.Summary <- data.frame(
  Column.Index=c(1:length(names(PDA))),
  Column.Name= names(PDA), 
  Column.Type=sapply(PDA, function(x) class(x)), 
  row.names=NULL)
)
##    Column.Index             Column.Name Column.Type
## 1             1                      id     integer
## 2             2               diagnosis      factor
## 3             3             radius_mean     numeric
## 4             4            texture_mean     numeric
## 5             5          perimeter_mean     numeric
## 6             6               area_mean     numeric
## 7             7         smoothness_mean     numeric
## 8             8        compactness_mean     numeric
## 9             9          concavity_mean     numeric
## 10           10     concave.points_mean     numeric
## 11           11           symmetry_mean     numeric
## 12           12  fractal_dimension_mean     numeric
## 13           13               radius_se     numeric
## 14           14              texture_se     numeric
## 15           15            perimeter_se     numeric
## 16           16                 area_se     numeric
## 17           17           smoothness_se     numeric
## 18           18          compactness_se     numeric
## 19           19            concavity_se     numeric
## 20           20       concave.points_se     numeric
## 21           21             symmetry_se     numeric
## 22           22    fractal_dimension_se     numeric
## 23           23            radius_worst     numeric
## 24           24           texture_worst     numeric
## 25           25         perimeter_worst     numeric
## 26           26              area_worst     numeric
## 27           27        smoothness_worst     numeric
## 28           28       compactness_worst     numeric
## 29           29         concavity_worst     numeric
## 30           30    concave.points_worst     numeric
## 31           31          symmetry_worst     numeric
## 32           32 fractal_dimension_worst     numeric

1.2 Data Quality Assessment


[A] No missing observations noted for any predictor.

[B] Low variance observed for 1 predictor with First.Second.Mode.Ratio>5.
     [B.1] concavity_se = 6.50

[C] No low variance observed for any predictor with Unique.Count.Ratio<0.01.

[D] High skewness observed for 5 predictors with Skewness>3 or Skewness<(-3).
     [D.1] radius_se = +3.08
     [D.2] perimeter_se = +3.43
     [D.3] area_se = +5.43
     [D.4] concavity_se = +5.10
     [D.5] fractal_dimension_se = +3.91
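For reference, moments::skewness() used in the chunk below computes the simple moment ratio g1 = m3 / m2^(3/2) with no bias correction, which is the statistic the thresholds above are applied to. A minimal base-R sketch on a toy vector (hypothetical, not from the dataset):

```r
# Moment-ratio skewness, matching the definition used by moments::skewness():
# g1 = m3 / m2^(3/2), where m_k is the k-th central sample moment (divisor n).
skew <- function(x) {
  n  <- length(x)
  m2 <- sum((x - mean(x))^2) / n
  m3 <- sum((x - mean(x))^3) / n
  m3 / m2^(3/2)
}
skew(c(1, 1, 1, 10))   # strongly right-skewed toy vector: ~1.155
```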

Code Chunk | Output
##################################
# Loading dataset
##################################
DQA <- BreastCancer

##################################
# Formulating an overall data quality assessment summary
##################################
(DQA.Summary <- data.frame(
  Column.Name= names(DQA),
  Column.Type=sapply(DQA, function(x) class(x)),
  Row.Count=sapply(DQA, function(x) nrow(DQA)),
  NA.Count=sapply(DQA,function(x)sum(is.na(x))),
  Fill.Rate=sapply(DQA,function(x)format(round((sum(!is.na(x))/nrow(DQA)),3),nsmall=3)),
  row.names=NULL)
)
##                Column.Name Column.Type Row.Count NA.Count Fill.Rate
## 1                       id     integer      1138        0     1.000
## 2                diagnosis      factor      1138        0     1.000
## 3              radius_mean     numeric      1138        0     1.000
## 4             texture_mean     numeric      1138        0     1.000
## 5           perimeter_mean     numeric      1138        0     1.000
## 6                area_mean     numeric      1138        0     1.000
## 7          smoothness_mean     numeric      1138        0     1.000
## 8         compactness_mean     numeric      1138        0     1.000
## 9           concavity_mean     numeric      1138        0     1.000
## 10     concave.points_mean     numeric      1138        0     1.000
## 11           symmetry_mean     numeric      1138        0     1.000
## 12  fractal_dimension_mean     numeric      1138        0     1.000
## 13               radius_se     numeric      1138        0     1.000
## 14              texture_se     numeric      1138        0     1.000
## 15            perimeter_se     numeric      1138        0     1.000
## 16                 area_se     numeric      1138        0     1.000
## 17           smoothness_se     numeric      1138        0     1.000
## 18          compactness_se     numeric      1138        0     1.000
## 19            concavity_se     numeric      1138        0     1.000
## 20       concave.points_se     numeric      1138        0     1.000
## 21             symmetry_se     numeric      1138        0     1.000
## 22    fractal_dimension_se     numeric      1138        0     1.000
## 23            radius_worst     numeric      1138        0     1.000
## 24           texture_worst     numeric      1138        0     1.000
## 25         perimeter_worst     numeric      1138        0     1.000
## 26              area_worst     numeric      1138        0     1.000
## 27        smoothness_worst     numeric      1138        0     1.000
## 28       compactness_worst     numeric      1138        0     1.000
## 29         concavity_worst     numeric      1138        0     1.000
## 30    concave.points_worst     numeric      1138        0     1.000
## 31          symmetry_worst     numeric      1138        0     1.000
## 32 fractal_dimension_worst     numeric      1138        0     1.000
##################################
# Listing all Predictors
##################################
DQA.Predictors <- DQA[,!names(DQA) %in% c("id","diagnosis")]

##################################
# Listing all numeric Predictors
##################################
DQA.Predictors.Numeric <- DQA.Predictors[,sapply(DQA.Predictors, is.numeric)]

if (length(names(DQA.Predictors.Numeric))>0) {
    print(paste0("There are ",
               (length(names(DQA.Predictors.Numeric))),
               " numeric predictor variable(s)."))
} else {
  print("There are no numeric predictor variables.")
}
## [1] "There are 30 numeric predictor variable(s)."
##################################
# Listing all factor Predictors
##################################
DQA.Predictors.Factor <- DQA.Predictors[,sapply(DQA.Predictors, is.factor)]

if (length(names(DQA.Predictors.Factor))>0) {
    print(paste0("There are ",
               (length(names(DQA.Predictors.Factor))),
               " factor predictor variable(s)."))
} else {
  print("There are no factor predictor variables.")
}
## [1] "There are no factor predictor variables."
##################################
# Formulating a data quality assessment summary for factor Predictors
##################################
if (length(names(DQA.Predictors.Factor))>0) {

  ##################################
  # Formulating a function to determine the first mode
  ##################################
  FirstModes <- function(x) {
    ux <- unique(na.omit(x))
    tab <- tabulate(match(x, ux))
    ux[tab == max(tab)]
  }

  ##################################
  # Formulating a function to determine the second mode
  ##################################
  SecondModes <- function(x) {
    ux <- unique(na.omit(x))
    tab <- tabulate(match(x, ux))
    fm <- ux[tab == max(tab)]
    sm <- na.omit(x)[!(na.omit(x) %in% fm)]
    usm <- unique(sm)
    tabsm <- tabulate(match(sm, usm))
    # Sentinel value when every observation belongs to the first mode
    if (length(usm) == 0) return("x")
    usm[tabsm == max(tabsm)]
  }

  (DQA.Predictors.Factor.Summary <- data.frame(
  Column.Name= names(DQA.Predictors.Factor),
  Column.Type=sapply(DQA.Predictors.Factor, function(x) class(x)),
  Unique.Count=sapply(DQA.Predictors.Factor, function(x) length(unique(x))),
  First.Mode.Value=sapply(DQA.Predictors.Factor, function(x) as.character(FirstModes(x)[1])),
  Second.Mode.Value=sapply(DQA.Predictors.Factor, function(x) as.character(SecondModes(x)[1])),
  First.Mode.Count=sapply(DQA.Predictors.Factor, function(x) sum(na.omit(x) == FirstModes(x)[1])),
  Second.Mode.Count=sapply(DQA.Predictors.Factor, function(x) sum(na.omit(x) == SecondModes(x)[1])),
  Unique.Count.Ratio=sapply(DQA.Predictors.Factor, function(x) format(round((length(unique(x))/nrow(DQA.Predictors.Factor)),3), nsmall=3)),
  First.Second.Mode.Ratio=sapply(DQA.Predictors.Factor, function(x) format(round((sum(na.omit(x) == FirstModes(x)[1])/sum(na.omit(x) == SecondModes(x)[1])),3), nsmall=3)),
  row.names=NULL)
  )

}
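As a quick sanity check, the two mode helpers defined above behave as follows on a toy vector (hypothetical values; the sentinel branch is omitted here since the vector has a second mode):

```r
FirstModes <- function(x) {
  ux <- unique(na.omit(x))
  tab <- tabulate(match(x, ux))
  ux[tab == max(tab)]                       # most frequent value(s)
}
SecondModes <- function(x) {
  ux <- unique(na.omit(x))
  tab <- tabulate(match(x, ux))
  fm <- ux[tab == max(tab)]
  sm <- na.omit(x)[!(na.omit(x) %in% fm)]   # drop first-mode values
  usm <- unique(sm)
  usm[tabulate(match(sm, usm)) == max(tabulate(match(sm, usm)))]
}
x <- c(1, 2, 2, 3, 3, 3)
FirstModes(x)    # 3 (appears three times)
SecondModes(x)   # 2 (appears twice)
```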

##################################
# Formulating a data quality assessment summary for numeric Predictors
##################################
if (length(names(DQA.Predictors.Numeric))>0) {

  ##################################
  # Formulating a function to determine the first mode
  ##################################
  FirstModes <- function(x) {
    ux <- unique(na.omit(x))
    tab <- tabulate(match(x, ux))
    ux[tab == max(tab)]
  }

  ##################################
  # Formulating a function to determine the second mode
  ##################################
  SecondModes <- function(x) {
    ux <- unique(na.omit(x))
    tab <- tabulate(match(x, ux))
    fm <- ux[tab == max(tab)]
    sm <- na.omit(x)[!(na.omit(x) %in% fm)]
    usm <- unique(sm)
    tabsm <- tabulate(match(sm, usm))
    # Sentinel value when every observation belongs to the first mode
    if (length(usm) == 0) return(0.00001)
    usm[tabsm == max(tabsm)]
  }

  (DQA.Predictors.Numeric.Summary <- data.frame(
  Column.Name= names(DQA.Predictors.Numeric),
  Column.Type=sapply(DQA.Predictors.Numeric, function(x) class(x)),
  Unique.Count=sapply(DQA.Predictors.Numeric, function(x) length(unique(x))),
  Unique.Count.Ratio=sapply(DQA.Predictors.Numeric, function(x) format(round((length(unique(x))/nrow(DQA.Predictors.Numeric)),3), nsmall=3)),
  First.Mode.Value=sapply(DQA.Predictors.Numeric, function(x) format(round((FirstModes(x)[1]),3),nsmall=3)),
  Second.Mode.Value=sapply(DQA.Predictors.Numeric, function(x) format(round((SecondModes(x)[1]),3),nsmall=3)),
  First.Mode.Count=sapply(DQA.Predictors.Numeric, function(x) sum(na.omit(x) == FirstModes(x)[1])),
  Second.Mode.Count=sapply(DQA.Predictors.Numeric, function(x) sum(na.omit(x) == SecondModes(x)[1])),
  First.Second.Mode.Ratio=sapply(DQA.Predictors.Numeric, function(x) format(round((sum(na.omit(x) == FirstModes(x)[1])/sum(na.omit(x) == SecondModes(x)[1])),3), nsmall=3)),
  Minimum=sapply(DQA.Predictors.Numeric, function(x) format(round(min(x,na.rm = TRUE),3), nsmall=3)),
  Mean=sapply(DQA.Predictors.Numeric, function(x) format(round(mean(x,na.rm = TRUE),3), nsmall=3)),
  Median=sapply(DQA.Predictors.Numeric, function(x) format(round(median(x,na.rm = TRUE),3), nsmall=3)),
  Maximum=sapply(DQA.Predictors.Numeric, function(x) format(round(max(x,na.rm = TRUE),3), nsmall=3)),
  Skewness=sapply(DQA.Predictors.Numeric, function(x) format(round(skewness(x,na.rm = TRUE),3), nsmall=3)),
  Kurtosis=sapply(DQA.Predictors.Numeric, function(x) format(round(kurtosis(x,na.rm = TRUE),3), nsmall=3)),
  Percentile25th=sapply(DQA.Predictors.Numeric, function(x) format(round(quantile(x,probs=0.25,na.rm = TRUE),3), nsmall=3)),
  Percentile75th=sapply(DQA.Predictors.Numeric, function(x) format(round(quantile(x,probs=0.75,na.rm = TRUE),3), nsmall=3)),
  row.names=NULL)
  )

}
##                Column.Name Column.Type Unique.Count Unique.Count.Ratio
## 1              radius_mean     numeric          456              0.401
## 2             texture_mean     numeric          479              0.421
## 3           perimeter_mean     numeric          522              0.459
## 4                area_mean     numeric          539              0.474
## 5          smoothness_mean     numeric          474              0.417
## 6         compactness_mean     numeric          537              0.472
## 7           concavity_mean     numeric          537              0.472
## 8      concave.points_mean     numeric          542              0.476
## 9            symmetry_mean     numeric          432              0.380
## 10  fractal_dimension_mean     numeric          499              0.438
## 11               radius_se     numeric          540              0.475
## 12              texture_se     numeric          519              0.456
## 13            perimeter_se     numeric          533              0.468
## 14                 area_se     numeric          528              0.464
## 15           smoothness_se     numeric          547              0.481
## 16          compactness_se     numeric          541              0.475
## 17            concavity_se     numeric          533              0.468
## 18       concave.points_se     numeric          507              0.446
## 19             symmetry_se     numeric          498              0.438
## 20    fractal_dimension_se     numeric          545              0.479
## 21            radius_worst     numeric          457              0.402
## 22           texture_worst     numeric          511              0.449
## 23         perimeter_worst     numeric          514              0.452
## 24              area_worst     numeric          544              0.478
## 25        smoothness_worst     numeric          411              0.361
## 26       compactness_worst     numeric          529              0.465
## 27         concavity_worst     numeric          539              0.474
## 28    concave.points_worst     numeric          492              0.432
## 29          symmetry_worst     numeric          500              0.439
## 30 fractal_dimension_worst     numeric          535              0.470
##    First.Mode.Value Second.Mode.Value First.Mode.Count Second.Mode.Count
## 1            12.340            13.000                8                 6
## 2            15.700            21.250                6                 4
## 3            82.610           132.900                6                 4
## 4           512.200           658.800                6                 4
## 5             0.101             0.108               10                 8
## 6             0.121             0.160                6                 4
## 7             0.000             0.120               26                 6
## 8             0.000             0.029               26                 6
## 9             0.177             0.181                8                 6
## 10            0.057             0.059                6                 4
## 11            0.286             0.298                6                 4
## 12            1.150             0.734                6                 4
## 13            1.778             2.406                8                 4
## 14           16.970            74.080                6                 4
## 15            0.006             0.005                4                 2
## 16            0.023             0.014                6                 4
## 17            0.000             0.017               26                 4
## 18            0.000             0.012               26                 6
## 19            0.013             0.015                8                 6
## 20            0.003             0.006                4                 2
## 21           12.360            13.340               10                 8
## 22           27.260            27.660                6                 4
## 23          117.700           184.600                6                 4
## 24         1269.000          2019.000                4                 2
## 25            0.131             0.149                8                 6
## 26            0.342             0.177                6                 4
## 27            0.000             0.450               26                 6
## 28            0.000             0.026               26                 6
## 29            0.320             0.361                6                 4
## 30            0.074             0.084                6                 4
##    First.Second.Mode.Ratio Minimum    Mean  Median  Maximum Skewness Kurtosis
## 1                    1.333   6.981  14.127  13.370   28.110    0.940    3.828
## 2                    1.500   9.710  19.290  18.840   39.280    0.649    3.741
## 3                    1.500  43.790  91.969  86.240  188.500    0.988    3.953
## 4                    1.500 143.500 654.889 551.100 2501.000    1.641    6.610
## 5                    1.250   0.053   0.096   0.096    0.163    0.455    3.838
## 6                    1.500   0.019   0.104   0.093    0.345    1.187    4.625
## 7                    4.333   0.000   0.089   0.062    0.427    1.397    4.971
## 8                    4.333   0.000   0.049   0.034    0.201    1.168    4.047
## 9                    1.333   0.106   0.181   0.179    0.304    0.724    4.266
## 10                   1.500   0.050   0.063   0.062    0.097    1.301    5.969
## 11                   1.500   0.112   0.405   0.324    2.873    3.080   20.521
## 12                   1.500   0.360   1.217   1.108    4.885    1.642    8.292
## 13                   2.000   0.757   2.866   2.287   21.980    3.435   24.204
## 14                   1.500   6.802  40.337  24.530  542.200    5.433   51.767
## 15                   2.000   0.002   0.007   0.006    0.031    2.308   13.368
## 16                   1.500   0.002   0.025   0.020    0.135    1.897    8.051
## 17                   6.500   0.000   0.032   0.026    0.396    5.097   51.423
## 18                   4.333   0.000   0.012   0.011    0.053    1.441    8.071
## 19                   1.333   0.008   0.021   0.019    0.079    2.189   10.816
## 20                   2.000   0.001   0.004   0.003    0.030    3.914   29.040
## 21                   1.250   7.930  16.269  14.970   36.040    1.100    3.925
## 22                   1.500  12.020  25.677  25.410   49.540    0.497    3.212
## 23                   1.500  50.410 107.261  97.660  251.200    1.125    4.050
## 24                   2.000 185.200 880.583 686.500 4254.000    1.854    7.347
## 25                   1.333   0.071   0.132   0.131    0.223    0.414    3.503
## 26                   1.500   0.027   0.254   0.212    1.058    1.470    6.002
## 27                   4.333   0.000   0.272   0.227    1.252    1.147    4.591
## 28                   4.333   0.000   0.115   0.100    0.291    0.491    2.459
## 29                   1.500   0.156   0.290   0.282    0.664    1.430    7.395
## 30                   1.500   0.055   0.084   0.080    0.208    1.658    8.188
##    Percentile25th Percentile75th
## 1          11.700         15.780
## 2          16.170         21.800
## 3          75.170        104.100
## 4         420.300        782.700
## 5           0.086          0.105
## 6           0.065          0.130
## 7           0.030          0.131
## 8           0.020          0.074
## 9           0.162          0.196
## 10          0.058          0.066
## 11          0.232          0.479
## 12          0.834          1.474
## 13          1.606          3.357
## 14         17.850         45.190
## 15          0.005          0.008
## 16          0.013          0.032
## 17          0.015          0.042
## 18          0.008          0.015
## 19          0.015          0.023
## 20          0.002          0.005
## 21         13.010         18.790
## 22         21.080         29.720
## 23         84.110        125.400
## 24        515.300       1084.000
## 25          0.117          0.146
## 26          0.147          0.339
## 27          0.114          0.383
## 28          0.065          0.161
## 29          0.250          0.318
## 30          0.071          0.092
##################################
# Identifying potential data quality issues
##################################

##################################
# Checking for missing observations
##################################
if ((nrow(DQA.Summary[DQA.Summary$NA.Count>0,]))>0){
  print(paste0("Missing observations noted for ",
               (nrow(DQA.Summary[DQA.Summary$NA.Count>0,])),
               " variable(s) with NA.Count>0 and Fill.Rate<1.0."))
  DQA.Summary[DQA.Summary$NA.Count>0,]
} else {
  print("No missing observations noted.")
}
## [1] "No missing observations noted."
##################################
# Checking for zero or near-zero variance Predictors
##################################
if (length(names(DQA.Predictors.Factor))==0) {
  print("No factor predictors noted.")
} else if (nrow(DQA.Predictors.Factor.Summary[as.numeric(as.character(DQA.Predictors.Factor.Summary$First.Second.Mode.Ratio))>5,])>0){
  print(paste0("Low variance observed for ",
               (nrow(DQA.Predictors.Factor.Summary[as.numeric(as.character(DQA.Predictors.Factor.Summary$First.Second.Mode.Ratio))>5,])),
               " factor variable(s) with First.Second.Mode.Ratio>5."))
  DQA.Predictors.Factor.Summary[as.numeric(as.character(DQA.Predictors.Factor.Summary$First.Second.Mode.Ratio))>5,]
} else {
  print("No low variance factor predictors due to high first-second mode ratio noted.")
}
## [1] "No factor predictors noted."
if (length(names(DQA.Predictors.Numeric))==0) {
  print("No numeric predictors noted.")
} else if (nrow(DQA.Predictors.Numeric.Summary[as.numeric(as.character(DQA.Predictors.Numeric.Summary$First.Second.Mode.Ratio))>5,])>0){
  print(paste0("Low variance observed for ",
               (nrow(DQA.Predictors.Numeric.Summary[as.numeric(as.character(DQA.Predictors.Numeric.Summary$First.Second.Mode.Ratio))>5,])),
               " numeric variable(s) with First.Second.Mode.Ratio>5."))
  DQA.Predictors.Numeric.Summary[as.numeric(as.character(DQA.Predictors.Numeric.Summary$First.Second.Mode.Ratio))>5,]
} else {
  print("No low variance numeric predictors due to high first-second mode ratio noted.")
}
## [1] "Low variance observed for 1 numeric variable(s) with First.Second.Mode.Ratio>5."
##     Column.Name Column.Type Unique.Count Unique.Count.Ratio First.Mode.Value
## 17 concavity_se     numeric          533              0.468            0.000
##    Second.Mode.Value First.Mode.Count Second.Mode.Count First.Second.Mode.Ratio
## 17             0.017               26                 4                   6.500
##    Minimum  Mean Median Maximum Skewness Kurtosis Percentile25th Percentile75th
## 17   0.000 0.032  0.026   0.396    5.097   51.423          0.015          0.042
if (length(names(DQA.Predictors.Numeric))==0) {
  print("No numeric predictors noted.")
} else if (nrow(DQA.Predictors.Numeric.Summary[as.numeric(as.character(DQA.Predictors.Numeric.Summary$Unique.Count.Ratio))<0.01,])>0){
  print(paste0("Low variance observed for ",
               (nrow(DQA.Predictors.Numeric.Summary[as.numeric(as.character(DQA.Predictors.Numeric.Summary$Unique.Count.Ratio))<0.01,])),
               " numeric variable(s) with Unique.Count.Ratio<0.01."))
  DQA.Predictors.Numeric.Summary[as.numeric(as.character(DQA.Predictors.Numeric.Summary$Unique.Count.Ratio))<0.01,]
} else {
  print("No low variance numeric predictors due to low unique count ratio noted.")
}
## [1] "No low variance numeric predictors due to low unique count ratio noted."
##################################
# Checking for skewed Predictors
##################################
if (length(names(DQA.Predictors.Numeric))==0) {
  print("No numeric predictors noted.")
} else if (nrow(DQA.Predictors.Numeric.Summary[as.numeric(as.character(DQA.Predictors.Numeric.Summary$Skewness))>3 |
                                               as.numeric(as.character(DQA.Predictors.Numeric.Summary$Skewness))<(-3),])>0){
  print(paste0("High skewness observed for ",
  (nrow(DQA.Predictors.Numeric.Summary[as.numeric(as.character(DQA.Predictors.Numeric.Summary$Skewness))>3 |
                                               as.numeric(as.character(DQA.Predictors.Numeric.Summary$Skewness))<(-3),])),
  " numeric variable(s) with Skewness>3 or Skewness<(-3)."))
  DQA.Predictors.Numeric.Summary[as.numeric(as.character(DQA.Predictors.Numeric.Summary$Skewness))>3 |
                                 as.numeric(as.character(DQA.Predictors.Numeric.Summary$Skewness))<(-3),]
} else {
  print("No skewed numeric predictors noted.")
}
## [1] "High skewness observed for 5 numeric variable(s) with Skewness>3 or Skewness<(-3)."
##             Column.Name Column.Type Unique.Count Unique.Count.Ratio
## 11            radius_se     numeric          540              0.475
## 13         perimeter_se     numeric          533              0.468
## 14              area_se     numeric          528              0.464
## 17         concavity_se     numeric          533              0.468
## 20 fractal_dimension_se     numeric          545              0.479
##    First.Mode.Value Second.Mode.Value First.Mode.Count Second.Mode.Count
## 11            0.286             0.298                6                 4
## 13            1.778             2.406                8                 4
## 14           16.970            74.080                6                 4
## 17            0.000             0.017               26                 4
## 20            0.003             0.006                4                 2
##    First.Second.Mode.Ratio Minimum   Mean Median Maximum Skewness Kurtosis
## 11                   1.500   0.112  0.405  0.324   2.873    3.080   20.521
## 13                   2.000   0.757  2.866  2.287  21.980    3.435   24.204
## 14                   1.500   6.802 40.337 24.530 542.200    5.433   51.767
## 17                   6.500   0.000  0.032  0.026   0.396    5.097   51.423
## 20                   2.000   0.001  0.004  0.003   0.030    3.914   29.040
##    Percentile25th Percentile75th
## 11          0.232          0.479
## 13          1.606          3.357
## 14         17.850         45.190
## 17          0.015          0.042
## 20          0.002          0.005
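
The skewness flag applied above can be reproduced on any numeric vector. A minimal sketch follows, assuming the e1071 package supplies the skewness() implementation (the summary above may use a different implementation or skewness type, so exact values can differ):

```r
##################################
# Sketch: computing skewness for a single
# numeric vector (e1071 assumed here; the
# summary above may use another implementation)
##################################
library(e1071)

set.seed(123)
SyntheticVector <- rexp(1000, rate = 2)  # synthetic right-skewed data

# Positive skewness indicates a long right tail;
# values >3 or <(-3) are flagged as highly skewed above
skewness(SyntheticVector)
```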

1.3 Data Preprocessing

1.3.1 Outlier Detection


[A] Outliers were noted for 29 out of the 30 predictors. Predictor values were visualized through boxplots, with observations flagged as suspected outliers using the IQR criterion: all observations above (75th percentile + 1.5 x IQR) or below (25th percentile - 1.5 x IQR) are suspected outliers, where IQR is the difference between the third quartile (75th percentile) and the first quartile (25th percentile). The suspected outlier counts per predictor were as follows:
     [A.1] radius_mean = 28
     [A.2] texture_mean = 14
     [A.3] perimeter_mean = 26
     [A.4] area_mean = 50
     [A.5] smoothness_mean = 12
     [A.6] compactness_mean = 32
     [A.7] concavity_mean = 36
     [A.8] concave.points_mean = 20
     [A.9] symmetry_mean = 30
     [A.10] fractal_dimension_mean = 30
     [A.11] radius_se = 76
     [A.12] texture_se = 40
     [A.13] perimeter_se = 76
     [A.14] area_se = 130
     [A.15] smoothness_se = 60
     [A.16] compactness_se = 56
     [A.17] concavity_se = 44
     [A.18] concave.points_se = 38
     [A.19] symmetry_se = 54
     [A.20] fractal_dimension_se = 56
     [A.21] radius_worst = 34
     [A.22] texture_worst = 10
     [A.23] perimeter_worst = 30
     [A.24] area_worst = 70
     [A.25] smoothness_worst = 14
     [A.26] compactness_worst = 32
     [A.27] concavity_worst = 24
     [A.28] symmetry_worst = 46
     [A.29] fractal_dimension_worst = 48
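
The fence computation behind the IQR criterion can be made explicit. The sketch below uses a synthetic vector rather than the actual dataset; note that boxplot.stats() applies the same 1.5 x IQR rule but via hinges, which can differ slightly from quantile()-based quartiles:

```r
##################################
# Sketch: explicit IQR fences on a synthetic
# vector (not the actual dataset)
##################################
set.seed(123)
SyntheticVector <- c(rnorm(100, mean = 14, sd = 3), 40, 45)  # two planted extremes

Q1 <- quantile(SyntheticVector, 0.25)
Q3 <- quantile(SyntheticVector, 0.75)
IQRValue <- Q3 - Q1

LowerFence <- Q1 - 1.5 * IQRValue
UpperFence <- Q3 + 1.5 * IQRValue

SuspectedOutliers <- SyntheticVector[SyntheticVector < LowerFence |
                                     SyntheticVector > UpperFence]
length(SuspectedOutliers)  # includes the two planted extremes
```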

Code Chunk | Output
##################################
# Loading dataset
##################################
DPA <- DQA[,!names(DQA) %in% c("id")]

##################################
# Gathering descriptive statistics
##################################
(DPA_Skimmed <- skim(DPA)) 
Data summary
Name DPA
Number of rows 1138
Number of columns 31
_______________________
Column type frequency:
factor 1
numeric 30
________________________
Group variables None

Variable type: factor

skim_variable n_missing complete_rate ordered n_unique top_counts
diagnosis 0 1 FALSE 2 B: 714, M: 424

Variable type: numeric

skim_variable n_missing complete_rate mean sd p0 p25 p50 p75 p100 hist
radius_mean 0 1 14.13 3.52 6.98 11.70 13.37 15.78 28.11 ▂▇▃▁▁
texture_mean 0 1 19.29 4.30 9.71 16.17 18.84 21.80 39.28 ▃▇▃▁▁
perimeter_mean 0 1 91.97 24.29 43.79 75.17 86.24 104.10 188.50 ▃▇▃▁▁
area_mean 0 1 654.89 351.76 143.50 420.30 551.10 782.70 2501.00 ▇▃▂▁▁
smoothness_mean 0 1 0.10 0.01 0.05 0.09 0.10 0.11 0.16 ▁▇▇▁▁
compactness_mean 0 1 0.10 0.05 0.02 0.06 0.09 0.13 0.35 ▇▇▂▁▁
concavity_mean 0 1 0.09 0.08 0.00 0.03 0.06 0.13 0.43 ▇▃▂▁▁
concave.points_mean 0 1 0.05 0.04 0.00 0.02 0.03 0.07 0.20 ▇▃▂▁▁
symmetry_mean 0 1 0.18 0.03 0.11 0.16 0.18 0.20 0.30 ▁▇▅▁▁
fractal_dimension_mean 0 1 0.06 0.01 0.05 0.06 0.06 0.07 0.10 ▆▇▂▁▁
radius_se 0 1 0.41 0.28 0.11 0.23 0.32 0.48 2.87 ▇▁▁▁▁
texture_se 0 1 1.22 0.55 0.36 0.83 1.11 1.47 4.88 ▇▅▁▁▁
perimeter_se 0 1 2.87 2.02 0.76 1.61 2.29 3.36 21.98 ▇▁▁▁▁
area_se 0 1 40.34 45.47 6.80 17.85 24.53 45.19 542.20 ▇▁▁▁▁
smoothness_se 0 1 0.01 0.00 0.00 0.01 0.01 0.01 0.03 ▇▃▁▁▁
compactness_se 0 1 0.03 0.02 0.00 0.01 0.02 0.03 0.14 ▇▃▁▁▁
concavity_se 0 1 0.03 0.03 0.00 0.02 0.03 0.04 0.40 ▇▁▁▁▁
concave.points_se 0 1 0.01 0.01 0.00 0.01 0.01 0.01 0.05 ▇▇▁▁▁
symmetry_se 0 1 0.02 0.01 0.01 0.02 0.02 0.02 0.08 ▇▃▁▁▁
fractal_dimension_se 0 1 0.00 0.00 0.00 0.00 0.00 0.00 0.03 ▇▁▁▁▁
radius_worst 0 1 16.27 4.83 7.93 13.01 14.97 18.79 36.04 ▆▇▃▁▁
texture_worst 0 1 25.68 6.14 12.02 21.08 25.41 29.72 49.54 ▃▇▆▁▁
perimeter_worst 0 1 107.26 33.59 50.41 84.11 97.66 125.40 251.20 ▇▇▃▁▁
area_worst 0 1 880.58 569.11 185.20 515.30 686.50 1084.00 4254.00 ▇▂▁▁▁
smoothness_worst 0 1 0.13 0.02 0.07 0.12 0.13 0.15 0.22 ▂▇▇▂▁
compactness_worst 0 1 0.25 0.16 0.03 0.15 0.21 0.34 1.06 ▇▅▁▁▁
concavity_worst 0 1 0.27 0.21 0.00 0.11 0.23 0.38 1.25 ▇▅▂▁▁
concave.points_worst 0 1 0.11 0.07 0.00 0.06 0.10 0.16 0.29 ▅▇▅▃▁
symmetry_worst 0 1 0.29 0.06 0.16 0.25 0.28 0.32 0.66 ▅▇▁▁▁
fractal_dimension_worst 0 1 0.08 0.02 0.06 0.07 0.08 0.09 0.21 ▇▃▁▁▁
##################################
# Outlier Detection
##################################

##################################
# Listing all Predictors
##################################
DPA.Predictors <- DPA[,!names(DPA) %in% c("diagnosis")]

##################################
# Listing all numeric Predictors
##################################
DPA.Predictors.Numeric <- DPA.Predictors[,sapply(DPA.Predictors, is.numeric)]

##################################
# Identifying outliers for the numeric Predictors
##################################
OutlierCountList <- c()

for (i in 1:ncol(DPA.Predictors.Numeric)) {
  # Flag suspected outliers using the IQR criterion via boxplot.stats
  Outliers <- boxplot.stats(DPA.Predictors.Numeric[,i])$out
  OutlierCount <- length(Outliers)
  OutlierCountList <- append(OutlierCountList,OutlierCount)
  OutlierIndices <- which(DPA.Predictors.Numeric[,i] %in% Outliers)
  # Reference the column by name inside aes() rather than by position
  print(
  ggplot(DPA.Predictors.Numeric, aes(x=.data[[names(DPA.Predictors.Numeric)[i]]])) +
  geom_boxplot() +
  theme_bw() +
  theme(axis.text.y=element_blank(), 
        axis.ticks.y=element_blank()) +
  xlab(names(DPA.Predictors.Numeric)[i]) +
  labs(title=names(DPA.Predictors.Numeric)[i],
       subtitle=paste0(OutlierCount, " Outlier(s) Detected")))
}

1.3.2 Zero and Near-Zero Variance


[A] No low variance was observed for any predictor using a preprocessing summary from the caret package. The nearZeroVar method was applied on the dataset with the freqCut and uniqueCut criteria set at 80/20 and 10, respectively.

Code Chunk | Output
##################################
# Zero and Near-Zero Variance
##################################

##################################
# Identifying columns with low variance
##################################
DPA_LowVariance <- nearZeroVar(DPA,
                               freqCut = 80/20,
                               uniqueCut = 10,
                               saveMetrics= TRUE)
(DPA_LowVariance[DPA_LowVariance$nzv,])
## [1] freqRatio     percentUnique zeroVar       nzv          
## <0 rows> (or 0-length row.names)
if ((nrow(DPA_LowVariance[DPA_LowVariance$nzv,]))==0){
  
  print("No low variance descriptors noted.")
  
} else {

  print(paste0("Low variance observed for ",
               (nrow(DPA_LowVariance[DPA_LowVariance$nzv,])),
               " numeric variable(s) with First.Second.Mode.Ratio>4 and Unique.Count.Ratio<0.10."))
  
  DPA_LowVarianceForRemoval <- (nrow(DPA_LowVariance[DPA_LowVariance$nzv,]))
  
  print(paste0("Low variance can be resolved by removing ",
               (nrow(DPA_LowVariance[DPA_LowVariance$nzv,])),
               " numeric variable(s)."))
  
  for (j in 1:DPA_LowVarianceForRemoval) {
  DPA_LowVarianceRemovedVariable <- rownames(DPA_LowVariance[DPA_LowVariance$nzv,])[j]
  print(paste0("Variable ",
               j,
               " for removal: ",
               DPA_LowVarianceRemovedVariable))
  }
  
  DPA %>%
  skim() %>%
  dplyr::filter(skim_variable %in% rownames(DPA_LowVariance[DPA_LowVariance$nzv,]))

}
## [1] "No low variance descriptors noted."

1.3.3 Collinearity


[A] High correlation values were noted for 15 pairs of numeric predictors with Pearson correlation coefficients >80% as confirmed using the preprocessing summaries from the caret package.
     [A.1] radius_mean and perimeter_mean = +100%
     [A.2] radius_worst and perimeter_worst = +99%
     [A.3] radius_mean and area_mean = +99%
     [A.4] perimeter_mean and area_mean = +99%
     [A.5] radius_worst and area_worst = +98%
     [A.6] perimeter_worst and area_worst = +98%
     [A.7] radius_se and perimeter_se = +97%
     [A.8] perimeter_mean and perimeter_worst = +97%
     [A.9] radius_mean and radius_worst = +97%
     [A.10] perimeter_mean and radius_worst = +97%
     [A.11] radius_mean and perimeter_worst = +96%
     [A.12] area_mean and radius_worst = +96%
     [A.13] area_mean and area_worst = +96%
     [A.14] area_mean and perimeter_worst = +96%
     [A.15] radius_se and area_se = +95%

[B] 7 predictors driving high pairwise correlation were recommended for removal using the findCorrelation preprocessing method from the caret package. The function examines the mean absolute correlation of each predictor and, for each highly correlated pair, removes the one with the largest mean absolute correlation.
     [B.1] perimeter_worst
     [B.2] radius_worst
     [B.3] perimeter_mean
     [B.4] area_worst
     [B.5] radius_mean
     [B.6] perimeter_se
     [B.7] area_se
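
The findCorrelation behaviour described above can be demonstrated on a small synthetic correlation matrix; the variable names below are illustrative, not from the dataset:

```r
##################################
# Sketch: caret::findCorrelation on a tiny
# synthetic correlation matrix (illustrative only)
##################################
library(caret)

set.seed(123)
x1 <- rnorm(100)
SyntheticData <- data.frame(
  x1 = x1,
  x2 = x1 + rnorm(100, sd = 0.01),  # near-perfect copy of x1
  x3 = rnorm(100)                   # independent predictor
)

SyntheticCorrelation <- cor(SyntheticData, method = "pearson")

# names = TRUE returns column names rather than indices;
# cutoff = 0.80 matches the >80% threshold used above.
# One member of the (x1, x2) pair is flagged for removal.
findCorrelation(SyntheticCorrelation, cutoff = 0.80, names = TRUE)
```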

Code Chunk | Output
##################################
# Visualizing pairwise correlation between Predictors
##################################
(DPA_Correlation <- cor(DPA.Predictors.Numeric,
                        method = "pearson",
                        use="pairwise.complete.obs"))
##                          radius_mean texture_mean perimeter_mean    area_mean
## radius_mean              1.000000000  0.323781891    0.997855281  0.987357170
## texture_mean             0.323781891  1.000000000    0.329533059  0.321085696
## perimeter_mean           0.997855281  0.329533059    1.000000000  0.986506804
## area_mean                0.987357170  0.321085696    0.986506804  1.000000000
## smoothness_mean          0.170581187 -0.023388516    0.207278164  0.177028377
## compactness_mean         0.506123578  0.236702222    0.556936211  0.498501682
## concavity_mean           0.676763550  0.302417828    0.716135650  0.685982829
## concave.points_mean      0.822528522  0.293464051    0.850977041  0.823268869
## symmetry_mean            0.147741242  0.071400980    0.183027212  0.151293079
## fractal_dimension_mean  -0.311630826 -0.076437183   -0.261476908 -0.283109812
## radius_se                0.679090388  0.275868676    0.691765014  0.732562227
## texture_se              -0.097317443  0.386357623   -0.086761078 -0.066280214
## perimeter_se             0.674171616  0.281673115    0.693134890  0.726628328
## area_se                  0.735863663  0.259844987    0.744982694  0.800085921
## smoothness_se           -0.222600125  0.006613777   -0.202694026 -0.166776667
## compactness_se           0.205999980  0.191974611    0.250743681  0.212582551
## concavity_se             0.194203623  0.143293077    0.228082345  0.207660060
## concave.points_se        0.376168956  0.163851025    0.407216916  0.372320282
## symmetry_se             -0.104320881  0.009127168   -0.081629327 -0.072496588
## fractal_dimension_se    -0.042641269  0.054457520   -0.005523391 -0.019886963
## radius_worst             0.969538973  0.352572947    0.969476363  0.962746086
## texture_worst            0.297007644  0.912044589    0.303038372  0.287488627
## perimeter_worst          0.965136514  0.358039575    0.970386887  0.959119574
## area_worst               0.941082460  0.343545947    0.941549808  0.959213326
## smoothness_worst         0.119616140  0.077503359    0.150549404  0.123522939
## compactness_worst        0.413462823  0.277829592    0.455774228  0.390410309
## concavity_worst          0.526911462  0.301025224    0.563879263  0.512605920
## concave.points_worst     0.744214198  0.295315843    0.771240789  0.722016626
## symmetry_worst           0.163953335  0.105007910    0.189115040  0.143569914
## fractal_dimension_worst  0.007065886  0.119205351    0.051018530  0.003737597
##                         smoothness_mean compactness_mean concavity_mean
## radius_mean                  0.17058119       0.50612358     0.67676355
## texture_mean                -0.02338852       0.23670222     0.30241783
## perimeter_mean               0.20727816       0.55693621     0.71613565
## area_mean                    0.17702838       0.49850168     0.68598283
## smoothness_mean              1.00000000       0.65912322     0.52198377
## compactness_mean             0.65912322       1.00000000     0.88312067
## concavity_mean               0.52198377       0.88312067     1.00000000
## concave.points_mean          0.55369517       0.83113504     0.92139103
## symmetry_mean                0.55777479       0.60264105     0.50066662
## fractal_dimension_mean       0.58479200       0.56536866     0.33678336
## radius_se                    0.30146710       0.49747345     0.63192482
## texture_se                   0.06840645       0.04620483     0.07621835
## perimeter_se                 0.29609193       0.54890526     0.66039079
## area_se                      0.24655243       0.45565285     0.61742681
## smoothness_se                0.33237544       0.13529927     0.09856375
## compactness_se               0.31894330       0.73872179     0.67027882
## concavity_se                 0.24839568       0.57051687     0.69127021
## concave.points_se            0.38067569       0.64226185     0.68325992
## symmetry_se                  0.20077438       0.22997659     0.17800921
## fractal_dimension_se         0.28360670       0.50731813     0.44930075
## radius_worst                 0.21312014       0.53531540     0.68823641
## texture_worst                0.03607180       0.24813283     0.29987889
## perimeter_worst              0.23885263       0.59021043     0.72956492
## area_worst                   0.20671836       0.50960381     0.67598723
## smoothness_worst             0.80532420       0.56554117     0.44882204
## compactness_worst            0.47246844       0.86580904     0.75496802
## concavity_worst              0.43492571       0.81627525     0.88410264
## concave.points_worst         0.50305335       0.81557322     0.86132303
## symmetry_worst               0.39430948       0.51022343     0.40946413
## fractal_dimension_worst      0.49931637       0.68738232     0.51492989
##                         concave.points_mean symmetry_mean
## radius_mean                      0.82252852    0.14774124
## texture_mean                     0.29346405    0.07140098
## perimeter_mean                   0.85097704    0.18302721
## area_mean                        0.82326887    0.15129308
## smoothness_mean                  0.55369517    0.55777479
## compactness_mean                 0.83113504    0.60264105
## concavity_mean                   0.92139103    0.50066662
## concave.points_mean              1.00000000    0.46249739
## symmetry_mean                    0.46249739    1.00000000
## fractal_dimension_mean           0.16691738    0.47992133
## radius_se                        0.69804983    0.30337926
## texture_se                       0.02147958    0.12805293
## perimeter_se                     0.71064987    0.31389276
## area_se                          0.69029854    0.22397022
## smoothness_se                    0.02765331    0.18732117
## compactness_se                   0.49042425    0.42165915
## concavity_se                     0.43916707    0.34262702
## concave.points_se                0.61563413    0.39329787
## symmetry_se                      0.09535079    0.44913654
## fractal_dimension_se             0.25758375    0.33178615
## radius_worst                     0.83031763    0.18572775
## texture_worst                    0.29275171    0.09065069
## perimeter_worst                  0.85592313    0.21916856
## area_worst                       0.80962962    0.17719338
## smoothness_worst                 0.45275305    0.42667503
## compactness_worst                0.66745368    0.47320001
## concavity_worst                  0.75239950    0.43372101
## concave.points_worst             0.91015531    0.43029661
## symmetry_worst                   0.37574415    0.69982580
## fractal_dimension_worst          0.36866113    0.43841350
##                         fractal_dimension_mean    radius_se  texture_se
## radius_mean                      -0.3116308263 0.6790903880 -0.09731744
## texture_mean                     -0.0764371834 0.2758686762  0.38635762
## perimeter_mean                   -0.2614769081 0.6917650135 -0.08676108
## area_mean                        -0.2831098117 0.7325622270 -0.06628021
## smoothness_mean                   0.5847920019 0.3014670983  0.06840645
## compactness_mean                  0.5653686634 0.4974734461  0.04620483
## concavity_mean                    0.3367833594 0.6319248221  0.07621835
## concave.points_mean               0.1669173832 0.6980498336  0.02147958
## symmetry_mean                     0.4799213301 0.3033792632  0.12805293
## fractal_dimension_mean            1.0000000000 0.0001109951  0.16417397
## radius_se                         0.0001109951 1.0000000000  0.21324734
## texture_se                        0.1641739659 0.2132473373  1.00000000
## perimeter_se                      0.0398299316 0.9727936770  0.22317073
## area_se                          -0.0901702475 0.9518301121  0.11156725
## smoothness_se                     0.4019644254 0.1645142198  0.39724285
## compactness_se                    0.5598366906 0.3560645755  0.23169970
## concavity_se                      0.4466303217 0.3323575376  0.19499846
## concave.points_se                 0.3411980444 0.5133464414  0.23028340
## symmetry_se                       0.3450073971 0.2405673625  0.41162068
## fractal_dimension_se              0.6881315775 0.2277535327  0.27972275
## radius_worst                     -0.2536914949 0.7150651951 -0.11169031
## texture_worst                    -0.0512692020 0.1947985568  0.40900277
## perimeter_worst                  -0.2051512113 0.7196838037 -0.10224192
## area_worst                       -0.2318544512 0.7515484761 -0.08319499
## smoothness_worst                  0.5049420754 0.1419185529 -0.07365766
## compactness_worst                 0.4587981567 0.2871031656 -0.09243935
## concavity_worst                   0.3462338763 0.3805846346 -0.06895622
## concave.points_worst              0.1753254492 0.5310623278 -0.11963752
## symmetry_worst                    0.3340186839 0.0945428304 -0.12821476
## fractal_dimension_worst           0.7672967792 0.0495594325 -0.04565457
##                         perimeter_se     area_se smoothness_se compactness_se
## radius_mean               0.67417162  0.73586366  -0.222600125      0.2060000
## texture_mean              0.28167311  0.25984499   0.006613777      0.1919746
## perimeter_mean            0.69313489  0.74498269  -0.202694026      0.2507437
## area_mean                 0.72662833  0.80008592  -0.166776667      0.2125826
## smoothness_mean           0.29609193  0.24655243   0.332375443      0.3189433
## compactness_mean          0.54890526  0.45565285   0.135299268      0.7387218
## concavity_mean            0.66039079  0.61742681   0.098563746      0.6702788
## concave.points_mean       0.71064987  0.69029854   0.027653308      0.4904242
## symmetry_mean             0.31389276  0.22397022   0.187321165      0.4216591
## fractal_dimension_mean    0.03982993 -0.09017025   0.401964425      0.5598367
## radius_se                 0.97279368  0.95183011   0.164514220      0.3560646
## texture_se                0.22317073  0.11156725   0.397242853      0.2316997
## perimeter_se              1.00000000  0.93765541   0.151075331      0.4163224
## area_se                   0.93765541  1.00000000   0.075150338      0.2848401
## smoothness_se             0.15107533  0.07515034   1.000000000      0.3366961
## compactness_se            0.41632237  0.28484006   0.336696081      1.0000000
## concavity_se              0.36248158  0.27089473   0.268684760      0.8012683
## concave.points_se         0.55626408  0.41572957   0.328429499      0.7440827
## symmetry_se               0.26648709  0.13410898   0.413506125      0.3947128
## fractal_dimension_se      0.24414277  0.12707090   0.427374207      0.8032688
## radius_worst              0.69720059  0.75737319  -0.230690710      0.2046072
## texture_worst             0.20037085  0.19649665  -0.074742965      0.1430026
## perimeter_worst           0.72103131  0.76121264  -0.217303755      0.2605158
## area_worst                0.73071297  0.81140796  -0.182195478      0.1993713
## smoothness_worst          0.13005439  0.12538943   0.314457456      0.2273942
## compactness_worst         0.34191945  0.28325654  -0.055558139      0.6787804
## concavity_worst           0.41889882  0.38510014  -0.058298387      0.6391467
## concave.points_worst      0.55489723  0.53816631  -0.102006796      0.4832083
## symmetry_worst            0.10993043  0.07412629  -0.107342098      0.2778784
## fractal_dimension_worst   0.08543257  0.01753930   0.101480315      0.5909728
##                         concavity_se concave.points_se  symmetry_se
## radius_mean                0.1942036        0.37616896 -0.104320881
## texture_mean               0.1432931        0.16385103  0.009127168
## perimeter_mean             0.2280823        0.40721692 -0.081629327
## area_mean                  0.2076601        0.37232028 -0.072496588
## smoothness_mean            0.2483957        0.38067569  0.200774376
## compactness_mean           0.5705169        0.64226185  0.229976591
## concavity_mean             0.6912702        0.68325992  0.178009208
## concave.points_mean        0.4391671        0.61563413  0.095350787
## symmetry_mean              0.3426270        0.39329787  0.449136542
## fractal_dimension_mean     0.4466303        0.34119804  0.345007397
## radius_se                  0.3323575        0.51334644  0.240567362
## texture_se                 0.1949985        0.23028340  0.411620680
## perimeter_se               0.3624816        0.55626408  0.266487092
## area_se                    0.2708947        0.41572957  0.134108980
## smoothness_se              0.2686848        0.32842950  0.413506125
## compactness_se             0.8012683        0.74408267  0.394712835
## concavity_se               1.0000000        0.77180399  0.309428578
## concave.points_se          0.7718040        1.00000000  0.312780223
## symmetry_se                0.3094286        0.31278022  1.000000000
## fractal_dimension_se       0.7273722        0.61104414  0.369078083
## radius_worst               0.1869035        0.35812667 -0.128120769
## texture_worst              0.1002410        0.08674121 -0.077473420
## perimeter_worst            0.2266804        0.39499925 -0.103753044
## area_worst                 0.1883527        0.34227116 -0.110342743
## smoothness_worst           0.1684813        0.21535060 -0.012661800
## compactness_worst          0.4848578        0.45288838  0.060254879
## concavity_worst            0.6625641        0.54959238  0.037119049
## concave.points_worst       0.4404723        0.60244961 -0.030413396
## symmetry_worst             0.1977878        0.14311567  0.389402485
## fractal_dimension_worst    0.4393293        0.31065455  0.078079476
##                         fractal_dimension_se radius_worst texture_worst
## radius_mean                     -0.042641269   0.96953897   0.297007644
## texture_mean                     0.054457520   0.35257295   0.912044589
## perimeter_mean                  -0.005523391   0.96947636   0.303038372
## area_mean                       -0.019886963   0.96274609   0.287488627
## smoothness_mean                  0.283606699   0.21312014   0.036071799
## compactness_mean                 0.507318127   0.53531540   0.248132833
## concavity_mean                   0.449300749   0.68823641   0.299878889
## concave.points_mean              0.257583746   0.83031763   0.292751713
## symmetry_mean                    0.331786146   0.18572775   0.090650688
## fractal_dimension_mean           0.688131577  -0.25369149  -0.051269202
## radius_se                        0.227753533   0.71506520   0.194798557
## texture_se                       0.279722748  -0.11169031   0.409002766
## perimeter_se                     0.244142773   0.69720059   0.200370854
## area_se                          0.127070903   0.75737319   0.196496649
## smoothness_se                    0.427374207  -0.23069071  -0.074742965
## compactness_se                   0.803268818   0.20460717   0.143002583
## concavity_se                     0.727372184   0.18690352   0.100240984
## concave.points_se                0.611044139   0.35812667   0.086741210
## symmetry_se                      0.369078083  -0.12812077  -0.077473420
## fractal_dimension_se             1.000000000  -0.03748762  -0.003195029
## radius_worst                    -0.037487618   1.00000000   0.359920754
## texture_worst                   -0.003195029   0.35992075   1.000000000
## perimeter_worst                 -0.001000398   0.99370792   0.365098245
## area_worst                      -0.022736147   0.98401456   0.345842283
## smoothness_worst                 0.170568316   0.21657443   0.225429415
## compactness_worst                0.390158842   0.47582004   0.360832339
## concavity_worst                  0.379974661   0.57397471   0.368365607
## concave.points_worst             0.215204013   0.78742385   0.359754610
## symmetry_worst                   0.111093956   0.24352920   0.233027461
## fractal_dimension_worst          0.591328066   0.09349198   0.219122425
##                         perimeter_worst  area_worst smoothness_worst
## radius_mean                 0.965136514  0.94108246       0.11961614
## texture_mean                0.358039575  0.34354595       0.07750336
## perimeter_mean              0.970386887  0.94154981       0.15054940
## area_mean                   0.959119574  0.95921333       0.12352294
## smoothness_mean             0.238852626  0.20671836       0.80532420
## compactness_mean            0.590210428  0.50960381       0.56554117
## concavity_mean              0.729564917  0.67598723       0.44882204
## concave.points_mean         0.855923128  0.80962962       0.45275305
## symmetry_mean               0.219168559  0.17719338       0.42667503
## fractal_dimension_mean     -0.205151211 -0.23185445       0.50494208
## radius_se                   0.719683804  0.75154848       0.14191855
## texture_se                 -0.102241922 -0.08319499      -0.07365766
## perimeter_se                0.721031310  0.73071297       0.13005439
## area_se                     0.761212636  0.81140796       0.12538943
## smoothness_se              -0.217303755 -0.18219548       0.31445746
## compactness_se              0.260515840  0.19937133       0.22739423
## concavity_se                0.226680426  0.18835265       0.16848132
## concave.points_se           0.394999252  0.34227116       0.21535060
## symmetry_se                -0.103753044 -0.11034274      -0.01266180
## fractal_dimension_se       -0.001000398 -0.02273615       0.17056832
## radius_worst                0.993707916  0.98401456       0.21657443
## texture_worst               0.365098245  0.34584228       0.22542941
## perimeter_worst             1.000000000  0.97757809       0.23677460
## area_worst                  0.977578091  1.00000000       0.20914533
## smoothness_worst            0.236774604  0.20914533       1.00000000
## compactness_worst           0.529407690  0.43829628       0.56818652
## concavity_worst             0.618344080  0.54333053       0.51852329
## concave.points_worst        0.816322102  0.74741880       0.54769090
## symmetry_worst              0.269492769  0.20914551       0.49383833
## fractal_dimension_worst     0.138956862  0.07964703       0.61762419
##                         compactness_worst concavity_worst concave.points_worst
## radius_mean                    0.41346282      0.52691146            0.7442142
## texture_mean                   0.27782959      0.30102522            0.2953158
## perimeter_mean                 0.45577423      0.56387926            0.7712408
## area_mean                      0.39041031      0.51260592            0.7220166
## smoothness_mean                0.47246844      0.43492571            0.5030534
## compactness_mean               0.86580904      0.81627525            0.8155732
## concavity_mean                 0.75496802      0.88410264            0.8613230
## concave.points_mean            0.66745368      0.75239950            0.9101553
## symmetry_mean                  0.47320001      0.43372101            0.4302966
## fractal_dimension_mean         0.45879816      0.34623388            0.1753254
## radius_se                      0.28710317      0.38058463            0.5310623
## texture_se                    -0.09243935     -0.06895622           -0.1196375
## perimeter_se                   0.34191945      0.41889882            0.5548972
## area_se                        0.28325654      0.38510014            0.5381663
## smoothness_se                 -0.05555814     -0.05829839           -0.1020068
## compactness_se                 0.67878035      0.63914670            0.4832083
## concavity_se                   0.48485780      0.66256413            0.4404723
## concave.points_se              0.45288838      0.54959238            0.6024496
## symmetry_se                    0.06025488      0.03711905           -0.0304134
## fractal_dimension_se           0.39015884      0.37997466            0.2152040
## radius_worst                   0.47582004      0.57397471            0.7874239
## texture_worst                  0.36083234      0.36836561            0.3597546
## perimeter_worst                0.52940769      0.61834408            0.8163221
## area_worst                     0.43829628      0.54333053            0.7474188
## smoothness_worst               0.56818652      0.51852329            0.5476909
## compactness_worst              1.00000000      0.89226090            0.8010804
## concavity_worst                0.89226090      1.00000000            0.8554339
## concave.points_worst           0.80108036      0.85543386            1.0000000
## symmetry_worst                 0.61444050      0.53251973            0.5025285
## fractal_dimension_worst        0.81045486      0.68651092            0.5111141
##                         symmetry_worst fractal_dimension_worst
## radius_mean                 0.16395333             0.007065886
## texture_mean                0.10500791             0.119205351
## perimeter_mean              0.18911504             0.051018530
## area_mean                   0.14356991             0.003737597
## smoothness_mean             0.39430948             0.499316369
## compactness_mean            0.51022343             0.687382323
## concavity_mean              0.40946413             0.514929891
## concave.points_mean         0.37574415             0.368661134
## symmetry_mean               0.69982580             0.438413498
## fractal_dimension_mean      0.33401868             0.767296779
## radius_se                   0.09454283             0.049559432
## texture_se                 -0.12821476            -0.045654569
## perimeter_se                0.10993043             0.085432572
## area_se                     0.07412629             0.017539295
## smoothness_se              -0.10734210             0.101480315
## compactness_se              0.27787843             0.590972763
## concavity_se                0.19778782             0.439329269
## concave.points_se           0.14311567             0.310654551
## symmetry_se                 0.38940248             0.078079476
## fractal_dimension_se        0.11109396             0.591328066
## radius_worst                0.24352920             0.093491979
## texture_worst               0.23302746             0.219122425
## perimeter_worst             0.26949277             0.138956862
## area_worst                  0.20914551             0.079647034
## smoothness_worst            0.49383833             0.617624192
## compactness_worst           0.61444050             0.810454856
## concavity_worst             0.53251973             0.686510921
## concave.points_worst        0.50252849             0.511114146
## symmetry_worst              1.00000000             0.537848206
## fractal_dimension_worst     0.53784821             1.000000000
DPA_CorrelationTest <- cor.mtest(DPA.Predictors.Numeric,
                       method = "pearson",
                       conf.level = 0.95)

corrplot(cor(DPA.Predictors.Numeric,
             method = "pearson",
             use="pairwise.complete.obs"),
             method = "circle",
             type = "upper",
             order = "original",
             tl.col = "black",
             tl.cex = 0.75,
             tl.srt = 90,
             sig.level = 0.05,
             p.mat = DPA_CorrelationTest$p,
             insig = "blank")

corrplot(cor(DPA.Predictors.Numeric,
             method = "pearson",
             use="pairwise.complete.obs"),
             method = "number",
             type = "upper",
             order = "original",
             tl.col = "black",
             tl.cex = 0.75,
             tl.srt = 90,
             sig.level = 0.05,
             number.cex = 0.65,
             p.mat = DPA_CorrelationTest$p,
             insig = "blank")
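The cor.mtest call above computes, for every predictor pair, the Pearson test p-value that corrplot then uses to blank insignificant cells. The same masking can be reproduced in base R; a minimal sketch on synthetic data (the variable names a, b and c are illustrative):

```r
# Pairwise Pearson correlations with p-values in base R,
# mirroring what corrplot::cor.mtest computes.
set.seed(12345678)
x <- data.frame(a = rnorm(100))
x$b <- x$a + rnorm(100, sd = 0.1)   # strongly related to a
x$c <- rnorm(100)                   # unrelated noise

r_mat <- cor(x, method = "pearson", use = "pairwise.complete.obs")
p_mat <- matrix(NA_real_, ncol(x), ncol(x),
                dimnames = list(names(x), names(x)))
for (i in seq_len(ncol(x))) {
  for (j in seq_len(ncol(x))) {
    p_mat[i, j] <- cor.test(x[[i]], x[[j]], method = "pearson")$p.value
  }
}

# Set to NA the correlations that are not significant at 0.05,
# as insig = "blank" does in the corrplot calls above.
r_masked <- ifelse(p_mat < 0.05, r_mat, NA)
```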

##################################
# Identifying the highly correlated variables
##################################
(DPA_HighlyCorrelatedCount <- sum(abs(DPA_Correlation[upper.tri(DPA_Correlation)])>0.95))
## [1] 15
if (DPA_HighlyCorrelatedCount == 0) {
  print("No highly correlated predictors noted.")
} else {
  print(paste0("High correlation observed for ",
               (DPA_HighlyCorrelatedCount),
               " pairs of numeric variable(s) with Correlation.Coefficient>0.95."))
  
  (DPA_HighlyCorrelatedPairs <- corr_cross(DPA.Predictors.Numeric,
  max_pvalue = 0.05, 
  top = DPA_HighlyCorrelatedCount,
  rm.na = TRUE,
  grid = FALSE
))
  
}
## [1] "High correlation observed for 15 pairs of numeric variable(s) with Correlation.Coefficient>0.95."

if (DPA_HighlyCorrelatedCount > 0) {
  DPA_HighlyCorrelated <- findCorrelation(DPA_Correlation, cutoff = 0.95)

  (DPA_HighlyCorrelatedForRemoval <- length(DPA_HighlyCorrelated))

  print(paste0("High correlation can be resolved by removing ",
               (DPA_HighlyCorrelatedForRemoval),
               " numeric variable(s)."))

  for (j in 1:DPA_HighlyCorrelatedForRemoval) {
  DPA_HighlyCorrelatedRemovedVariable <- colnames(DPA.Predictors.Numeric)[DPA_HighlyCorrelated[j]]
  print(paste0("Variable ",
               j,
               " for removal: ",
               DPA_HighlyCorrelatedRemovedVariable))
  }

}
## [1] "High correlation can be resolved by removing 7 numeric variable(s)."
## [1] "Variable 1 for removal: perimeter_worst"
## [1] "Variable 2 for removal: radius_worst"
## [1] "Variable 3 for removal: perimeter_mean"
## [1] "Variable 4 for removal: area_worst"
## [1] "Variable 5 for removal: radius_mean"
## [1] "Variable 6 for removal: perimeter_se"
## [1] "Variable 7 for removal: area_se"
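findCorrelation resolves the 15 flagged pairs by repeatedly removing, from the worst offending pair, the variable with the larger mean absolute correlation to all other predictors, which is why only 7 removals suffice. A rough base-R sketch of that heuristic on synthetic data (a simplification, not caret's exact implementation; drop_high_cor is an illustrative name):

```r
# Greedy high-correlation filter, approximating caret::findCorrelation:
# among each pair with |r| > cutoff, drop the variable whose mean
# absolute correlation with all other variables is larger.
drop_high_cor <- function(r, cutoff = 0.95) {
  drop <- character(0)
  repeat {
    r_sub <- r[!rownames(r) %in% drop, !colnames(r) %in% drop, drop = FALSE]
    diag(r_sub) <- 0
    if (max(abs(r_sub)) <= cutoff) break
    idx <- which(abs(r_sub) == max(abs(r_sub)), arr.ind = TRUE)[1, ]
    pair <- rownames(r_sub)[idx]
    means <- rowMeans(abs(r_sub))[pair]
    drop <- c(drop, names(which.max(means)))
  }
  drop
}

set.seed(12345678)
d <- data.frame(u = rnorm(200))
d$v <- d$u + rnorm(200, sd = 0.05)  # near-duplicate of u
d$w <- rnorm(200)
drop_high_cor(cor(d), cutoff = 0.95)
```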

1.3.4 Linear Dependency


[A] No linear dependencies were noted for any subset of numeric variables based on the preprocessing summary from the caret package, applying the findLinearCombos method, which uses the QR decomposition of the predictor matrix to enumerate sets of linear combinations (if they exist).

Code Chunk | Output
##################################
# Linear Dependencies
##################################

##################################
# Finding linear dependencies
##################################
DPA_LinearlyDependent <- findLinearCombos(DPA.Predictors.Numeric)

##################################
# Identifying the linearly dependent variables
##################################

(DPA_LinearlyDependentCount <- length(DPA_LinearlyDependent$linearCombos))
## [1] 0
if (DPA_LinearlyDependentCount == 0) {
  print("No linearly dependent predictors noted.")
} else {
  print(paste0("Linear dependency observed for ",
               (DPA_LinearlyDependentCount),
               " subset(s) of numeric variable(s)."))
  
  for (i in 1:DPA_LinearlyDependentCount) {
    DPA_LinearlyDependentSubset <- colnames(DPA.Predictors.Numeric)[DPA_LinearlyDependent$linearCombos[[i]]]
    print(paste0("Linear dependent variable(s) for subset ",
                 i,
                 " include: ",
                 DPA_LinearlyDependentSubset))
  }
  
}
## [1] "No linearly dependent predictors noted."
##################################
# Identifying the linearly dependent variables for removal
##################################

if (DPA_LinearlyDependentCount > 0) {
  DPA_LinearlyDependent <- findLinearCombos(DPA.Predictors.Numeric)
  
  DPA_LinearlyDependentForRemoval <- length(DPA_LinearlyDependent$remove)
  
  print(paste0("Linear dependency can be resolved by removing ",
               (DPA_LinearlyDependentForRemoval),
               " numeric variable(s)."))
  
  for (j in 1:DPA_LinearlyDependentForRemoval) {
  DPA_LinearlyDependentRemovedVariable <- colnames(DPA.Predictors.Numeric)[DPA_LinearlyDependent$remove[j]]
  print(paste0("Variable ",
               j,
               " for removal: ",
               DPA_LinearlyDependentRemovedVariable))
  }

}
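findLinearCombos rests on the QR decomposition: a set of columns is linearly dependent exactly when the numerical rank falls below the column count. A minimal base-R illustration on a synthetic matrix with one exact linear combination:

```r
# Detecting an exact linear dependency via QR decomposition,
# the mechanism behind caret::findLinearCombos.
set.seed(12345678)
m <- cbind(x1 = rnorm(50), x2 = rnorm(50))
m <- cbind(m, x3 = m[, "x1"] + m[, "x2"])  # x3 = x1 + x2, an exact combo

qr_m <- qr(m)
qr_m$rank   # 2 < ncol(m) = 3, so a dependency exists
# Columns beyond the rank in the pivoted order are the dependent ones
colnames(m)[qr_m$pivot[-seq_len(qr_m$rank)]]
```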

1.3.5 Distributional Shape


[A] A shape transformation was applied to reduce skewness and minimize outliers, stabilizing the data distributions, using the BoxCox method from the caret package, which transforms the distributional shape of predictors with strictly positive values.

[B] Skewness measurements improved for all predictors except 1, which retained Skewness>3.
     [B.1] concavity_se = +5.10

[C] Outliers were minimized for all predictors except 5, which showed no improvement even after the shape transformation, as assessed using the IQR criterion.
     [C.1] concavity_mean = 36
     [C.2] concave.points_mean = 20
     [C.3] concavity_se = 44
     [C.4] concave.points_se = 38
     [C.5] concavity_worst = 24
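The Box-Cox family rescales a strictly positive x to (x^λ − 1)/λ for λ ≠ 0 and to log(x) for λ = 0, with preProcess estimating λ per predictor by maximum likelihood. A base-R sketch of the transform itself, with the λ estimation step omitted and an illustrative λ supplied directly:

```r
# Box-Cox transform for a strictly positive vector and a given lambda.
# caret::preProcess(method = "BoxCox") additionally estimates lambda
# per predictor; here lambda is supplied directly for illustration.
box_cox <- function(x, lambda) {
  stopifnot(all(x > 0))
  if (abs(lambda) < .Machine$double.eps^0.5) log(x) else (x^lambda - 1) / lambda
}

set.seed(12345678)
x <- rexp(1000, rate = 1) + 0.1     # right-skewed, strictly positive
skew <- function(v) mean((v - mean(v))^3) / sd(v)^3
c(before = skew(x), after = skew(box_cox(x, lambda = 0.25)))
```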

Code Chunk | Output
##################################
# Shape Transformation
##################################

##################################
# Applying a Box-Cox transformation
##################################
DPA_BoxCox <- preProcess(DPA.Predictors.Numeric, method = c("BoxCox"))
DPA_BoxCoxTransformed <- predict(DPA_BoxCox, DPA.Predictors.Numeric)

for (i in 1:ncol(DPA_BoxCoxTransformed)) {
  Median <- format(round(median(DPA_BoxCoxTransformed[,i],na.rm = TRUE),2), nsmall=2)
  Mean <- format(round(mean(DPA_BoxCoxTransformed[,i],na.rm = TRUE),2), nsmall=2)
  Skewness <- format(round(skewness(DPA_BoxCoxTransformed[,i],na.rm = TRUE),2), nsmall=2)
  print(
  ggplot(DPA_BoxCoxTransformed, aes(x=DPA_BoxCoxTransformed[,i])) +
  geom_histogram(binwidth=1,color="black", fill="white") +
  geom_vline(aes(xintercept=mean(DPA_BoxCoxTransformed[,i])),
            color="blue", size=1) +
    geom_vline(aes(xintercept=median(DPA_BoxCoxTransformed[,i])),
            color="red", size=1) +
  theme_bw() +
  ylab("Count") +
  xlab(names(DPA_BoxCoxTransformed)[i]) +
  labs(title=names(DPA_BoxCoxTransformed)[i],
       subtitle=paste0("Median = ", Median,
                       ", Mean = ", Mean,
                       ", Skewness = ", Skewness)))
}

##################################
# Identifying outliers for the numeric predictors
##################################
OutlierCountList <- c()

for (i in 1:ncol(DPA_BoxCoxTransformed)) {
  Outliers <- boxplot.stats(DPA_BoxCoxTransformed[,i])$out
  OutlierCount <- length(Outliers)
  OutlierCountList <- append(OutlierCountList,OutlierCount)
  OutlierIndices <- which(DPA_BoxCoxTransformed[,i] %in% c(Outliers))
  print(
  ggplot(DPA_BoxCoxTransformed, aes(x=DPA_BoxCoxTransformed[,i])) +
  geom_boxplot() +
  theme_bw() +
  theme(axis.text.y=element_blank(), 
        axis.ticks.y=element_blank()) +
  xlab(names(DPA_BoxCoxTransformed)[i]) +
  labs(title=names(DPA_BoxCoxTransformed)[i],
       subtitle=paste0(OutlierCount, " Outlier(s) Detected")))
}

DPA_BoxCoxTransformed$diagnosis <- DPA[,c("diagnosis")]
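The IQR criterion applied by boxplot.stats above flags values falling outside [Q1 − 1.5·IQR, Q3 + 1.5·IQR]. A base-R sketch of the same rule (note that boxplot.stats uses the box hinges, which can differ slightly from quantile() on small samples):

```r
# IQR outlier rule, as used by boxplot.stats():
# flag values beyond 1.5 * IQR from the quartiles.
iqr_outliers <- function(x) {
  q <- quantile(x, c(0.25, 0.75), na.rm = TRUE)
  spread <- 1.5 * (q[2] - q[1])
  x[x < q[1] - spread | x > q[2] + spread]
}

set.seed(12345678)
x <- c(rnorm(100), 8, -9)   # two planted extreme values
iqr_outliers(x)
```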

1.3.6 Pre-Processed Dataset


[A] A total of 12 predictors were removed prior to data exploration and modelling due to issues identified during data preprocessing.
     [A.1] concavity_se = Low variance and high skewness
     [A.2] perimeter_worst = High correlation with radius_worst, area_worst, perimeter_mean, radius_mean and area_mean
     [A.3] radius_worst = High correlation with perimeter_worst, area_worst, radius_mean, perimeter_mean and area_mean
     [A.4] perimeter_mean = High correlation with radius_mean, area_mean, perimeter_worst and radius_worst
     [A.5] area_worst = High correlation with radius_worst, perimeter_worst and area_mean
     [A.6] radius_mean = High correlation with perimeter_mean, area_mean, radius_worst and perimeter_worst
     [A.7] perimeter_se = High correlation with radius_se
     [A.8] area_se = High correlation with radius_se
     [A.9] concavity_mean = High outlier count even after shape transformation
     [A.10] concave.points_mean = High outlier count even after shape transformation
     [A.11] concave.points_se = High outlier count even after shape transformation
     [A.12] concavity_worst = High outlier count even after shape transformation

[B] The preprocessed tabular dataset consisted of 1138 observations and 19 variables (including 1 response and 18 predictors).
     [B.1] 1138 rows (observations)
     [B.2] 19 columns (variables)
            [B.2.1] 1/19 response = diagnosis (factor)
            [B.2.2] 18/19 predictors = 18/18 numeric
                     [B.2.2.1] texture_mean (numeric)
                     [B.2.2.2] area_mean (numeric)
                     [B.2.2.3] smoothness_mean (numeric)
                     [B.2.2.4] compactness_mean (numeric)
                     [B.2.2.5] symmetry_mean (numeric)
                     [B.2.2.6] fractal_dimension_mean (numeric)
                     [B.2.2.7] radius_se (numeric)
                     [B.2.2.8] texture_se (numeric)
                     [B.2.2.9] smoothness_se (numeric)
                     [B.2.2.10] compactness_se (numeric)
                     [B.2.2.11] symmetry_se (numeric)
                     [B.2.2.12] fractal_dimension_se (numeric)
                     [B.2.2.13] texture_worst (numeric)
                     [B.2.2.14] smoothness_worst (numeric)
                     [B.2.2.15] compactness_worst (numeric)
                     [B.2.2.16] concave.points_worst (numeric)
                     [B.2.2.17] symmetry_worst (numeric)
                     [B.2.2.18] fractal_dimension_worst (numeric)

Code Chunk | Output
##################################
# Creating the pre-modelling dataset
##################################
PMA <- DPA_BoxCoxTransformed[,!names(DPA_BoxCoxTransformed) %in% c("concavity_se",
                                                                   "perimeter_worst",
                                                                   "radius_worst",
                                                                   "perimeter_mean",
                                                                   "area_worst",
                                                                   "radius_mean",
                                                                   "perimeter_se",
                                                                   "area_se",
                                                                   "concavity_mean",
                                                                   "concave.points_mean",
                                                                   "concave.points_se",
                                                                   "concavity_worst")]

##################################
# Gathering descriptive statistics
##################################
(PMA_Skimmed <- skim(PMA))
Data summary
Name PMA
Number of rows 1138
Number of columns 19
_______________________
Column type frequency:
factor 1
numeric 18
________________________
Group variables None

Variable type: factor

skim_variable n_missing complete_rate ordered n_unique top_counts
diagnosis 0 1 FALSE 2 B: 714, M: 424

Variable type: numeric

skim_variable n_missing complete_rate mean sd p0 p25 p50 p75 p100 hist
texture_mean 0 1 2.94 0.22 2.27 2.78 2.94 3.08 3.67 ▁▅▇▃▁
area_mean 0 1 6.36 0.48 4.97 6.04 6.31 6.66 7.82 ▁▅▇▃▁
smoothness_mean 0 1 -2.35 0.15 -2.94 -2.45 -2.34 -2.25 -1.81 ▁▂▇▃▁
compactness_mean 0 1 -2.38 0.49 -3.94 -2.73 -2.38 -2.04 -1.06 ▁▅▇▇▂
symmetry_mean 0 1 -2.26 0.25 -3.20 -2.42 -2.25 -2.10 -1.43 ▁▂▇▅▁
fractal_dimension_mean 0 1 -130.58 26.03 -199.82 -149.68 -131.52 -113.87 -52.16 ▁▆▇▃▁
radius_se 0 1 -1.42 0.81 -3.51 -1.98 -1.42 -0.86 0.86 ▁▆▇▅▁
texture_se 0 1 0.10 0.43 -1.02 -0.18 0.10 0.39 1.59 ▂▆▇▂▁
smoothness_se 0 1 -11.83 1.66 -19.20 -12.84 -11.85 -10.78 -6.11 ▁▂▇▅▁
compactness_se 0 1 -3.88 0.65 -6.10 -4.34 -3.89 -3.43 -2.00 ▁▃▇▆▁
symmetry_se 0 1 -16.51 3.52 -28.80 -18.91 -16.46 -14.16 -5.98 ▁▃▇▅▁
fractal_dimension_se 0 1 -15.48 2.88 -24.04 -17.43 -15.37 -13.46 -6.23 ▁▅▇▃▁
texture_worst 0 1 4.53 0.46 3.22 4.20 4.55 4.85 5.91 ▁▅▇▅▁
smoothness_worst 0 1 -1.52 0.09 -1.82 -1.58 -1.52 -1.46 -1.21 ▁▃▇▃▁
compactness_worst 0 1 -1.55 0.62 -3.60 -1.92 -1.55 -1.08 0.06 ▁▃▇▆▁
concave.points_worst 0 1 0.11 0.07 0.00 0.06 0.10 0.16 0.29 ▅▇▅▃▁
symmetry_worst 0 1 -1.77 0.37 -3.06 -2.00 -1.76 -1.55 -0.45 ▁▃▇▂▁
fractal_dimension_worst 0 1 -19.62 4.79 -32.59 -22.99 -19.73 -16.32 -5.17 ▁▅▇▃▁

1.4 Data Exploration


[A] Individual predictors that demonstrated excellent discrimination between diagnosis=M and diagnosis=B in terms of the area under the receiver operating characteristic curve (AUROC>0.80) are as follows:
     [A.1] concave.points_worst = 0.97
     [A.2] area_mean = 0.94
     [A.3] radius_se = 0.87
     [A.4] compactness_mean = 0.86
     [A.5] compactness_worst = 0.86

[B] To allow a better comparison of the ensemble methods, only predictors that demonstrated fair discrimination between diagnosis=M and diagnosis=B in terms of the area under the receiver operating characteristic curve (0.70<AUROC<0.80) were selected for the modelling process, enumerated as follows:
     [B.1] texture_worst = 0.78
     [B.2] texture_mean = 0.77
     [B.3] smoothness_worst = 0.75
     [B.4] symmetry_worst = 0.74
     [B.5] compactness_se = 0.73
     [B.6] smoothness_mean = 0.72
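filterVarImp scores each predictor by the AUROC it achieves alone, which equals the probability that a randomly chosen M case scores higher than a randomly chosen B case. That identity makes the AUC computable from midranks (the Mann-Whitney statistic); a base-R sketch:

```r
# ROC AUC for a single numeric predictor via the Mann-Whitney identity:
# AUC = P(score_positive > score_negative), ties counted as 1/2.
auc <- function(score, label, positive = "M") {
  pos <- score[label == positive]
  neg <- score[label != positive]
  r <- rank(c(pos, neg))   # midranks handle ties
  (sum(r[seq_along(pos)]) - length(pos) * (length(pos) + 1) / 2) /
    (length(pos) * length(neg))
}

# Perfectly separated scores give AUC = 1; reversed scores give 0.
auc(c(5, 6, 7, 1, 2, 3), c("M", "M", "M", "B", "B", "B"))  # 1
auc(c(1, 2, 3, 5, 6, 7), c("M", "M", "M", "B", "B", "B"))  # 0
```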

Code Chunk | Output
##################################
# Loading dataset
##################################
DPA <- PMA

##################################
# Listing all predictors
##################################
DPA.Predictors <- DPA[,!names(DPA) %in% c("diagnosis")]

##################################
# Listing all numeric predictors
##################################
DPA.Predictors.Numeric <- DPA.Predictors[,sapply(DPA.Predictors, is.numeric)]
ncol(DPA.Predictors.Numeric)
## [1] 18
##################################
# Converting response variable data type to factor
##################################
DPA$diagnosis <- as.factor(DPA$diagnosis)
length(levels(DPA$diagnosis))
## [1] 2
##################################
# Formulating the box plots
##################################
featurePlot(x = DPA.Predictors.Numeric, 
            y = DPA$diagnosis,
            plot = "box",
            scales = list(x = list(relation="free", rot = 90), 
                          y = list(relation="free")),
            adjust = 1.5, 
            pch = "|", 
            layout = c(6, 3))

##################################
# Obtaining the AUROC
##################################
AUROC <- filterVarImp(x = DPA.Predictors.Numeric,
                        y = DPA$diagnosis)

##################################
# Formulating the summary table
##################################
AUROC_Summary <- AUROC 

AUROC_Summary$Predictor <- rownames(AUROC)
names(AUROC_Summary)[1] <- "AUROC"
AUROC_Summary$Metric <- rep("AUROC",nrow(AUROC))

AUROC_Summary[order(AUROC_Summary$AUROC, decreasing=TRUE),] 
##                             AUROC         B               Predictor Metric
## concave.points_worst    0.9667037 0.9667037    concave.points_worst  AUROC
## area_mean               0.9383159 0.9383159               area_mean  AUROC
## radius_se               0.8683341 0.8683341               radius_se  AUROC
## compactness_mean        0.8637823 0.8637823        compactness_mean  AUROC
## compactness_worst       0.8623025 0.8623025       compactness_worst  AUROC
## texture_worst           0.7846308 0.7846308           texture_worst  AUROC
## texture_mean            0.7758245 0.7758245            texture_mean  AUROC
## smoothness_worst        0.7540563 0.7540563        smoothness_worst  AUROC
## symmetry_worst          0.7369391 0.7369391          symmetry_worst  AUROC
## compactness_se          0.7272805 0.7272805          compactness_se  AUROC
## smoothness_mean         0.7220416 0.7220416         smoothness_mean  AUROC
## symmetry_mean           0.6985624 0.6985624           symmetry_mean  AUROC
## fractal_dimension_worst 0.6859706 0.6859706 fractal_dimension_worst  AUROC
## fractal_dimension_se    0.6203028 0.6203028    fractal_dimension_se  AUROC
## symmetry_se             0.5551107 0.5551107             symmetry_se  AUROC
## smoothness_se           0.5311625 0.5311625           smoothness_se  AUROC
## fractal_dimension_mean  0.5154656 0.5154656  fractal_dimension_mean  AUROC
## texture_se              0.5115943 0.5115943              texture_se  AUROC
##################################
# Exploring predictor performance
##################################
dotplot(Predictor ~ AUROC | Metric, 
        AUROC_Summary,
        origin = 0,
        type = c("p", "h"),
        pch = 16,
        cex = 2,
        alpha = 0.45,
        prepanel = function(x, y) {
            list(ylim = levels(reorder(y, x)))
        },
        panel = function(x, y, ...) {
            panel.dotplot(x, reorder(y, x), ...)
        })

##################################
# Splitting the pre-modelling dataset
# into the train and test sets
##################################
DPA <- DPA[,colnames(DPA) %in% c("diagnosis",
                                 "texture_worst",
                                 "texture_mean",
                                 "smoothness_worst",
                                 "symmetry_worst",
                                 "compactness_se",
                                 "smoothness_mean")]
set.seed(12345678)
MA_Train_Index  <- createDataPartition(DPA$diagnosis,p=0.8)[[1]]
MA_Train        <- DPA[ MA_Train_Index, ]
MA_Test         <- DPA[-MA_Train_Index, ]
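createDataPartition draws the 80% training index separately within each diagnosis class, so the M/B ratio is preserved in both splits. A base-R sketch of stratified sampling to the same effect (not caret's exact quantile-grouping implementation; stratified_index is an illustrative name):

```r
# Stratified 80/20 split in base R, approximating what
# caret::createDataPartition does for a factor outcome.
stratified_index <- function(y, p = 0.8) {
  unlist(lapply(split(seq_along(y), y), function(idx) {
    sample(idx, size = floor(length(idx) * p))
  }), use.names = FALSE)
}

set.seed(12345678)
y <- factor(rep(c("B", "M"), times = c(70, 30)))
train_idx <- stratified_index(y, p = 0.8)
table(y[train_idx])   # 56 B, 24 M: class ratio preserved
```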

1.5 Model Boosting

1.5.1 Adaptive Boosting (MBS_AB)


Details.
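AdaBoost.M1 fits weak classifiers sequentially, upweighting the cases the previous round misclassified, and combines them by weighted vote; coeflearn = "Breiman" weights each learner by alpha = 0.5*ln((1 - err)/err). A compact base-R sketch with decision stumps (illustrative only; the caret run below delegates to the adabag implementation with rpart trees, and the helper names here are hypothetical):

```r
# AdaBoost.M1 with decision stumps in base R; labels coded as -1/+1.
# coeflearn = "Breiman" corresponds to alpha = 0.5 * log((1 - err) / err).
fit_stump <- function(x, y, w) {
  best <- list(err = Inf)
  for (j in seq_len(ncol(x))) for (s in unique(x[, j])) for (pol in c(1, -1)) {
    pred <- ifelse(pol * (x[, j] - s) > 0, 1, -1)
    err <- sum(w * (pred != y))
    if (err < best$err) best <- list(j = j, s = s, pol = pol, err = err)
  }
  best
}

adaboost_m1 <- function(x, y, rounds = 10) {
  w <- rep(1 / length(y), length(y))
  score <- numeric(length(y))
  for (r in seq_len(rounds)) {
    st <- fit_stump(x, y, w)
    pred <- ifelse(st$pol * (x[, st$j] - st$s) > 0, 1, -1)
    alpha <- 0.5 * log((1 - st$err) / max(st$err, 1e-10))  # Breiman weight
    w <- w * exp(-alpha * y * pred)   # upweight misclassified cases
    w <- w / sum(w)
    score <- score + alpha * pred
  }
  sign(score)
}

set.seed(12345678)
x <- matrix(rnorm(200), ncol = 2)
y <- ifelse(x[, 1] + x[, 2] > 0, 1, -1)
mean(adaboost_m1(x, y, rounds = 20) == y)   # high training accuracy
```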

Code Chunk | Output
##################################
# Setting the cross validation process
# using the Repeated K-Fold
##################################
set.seed(12345678)
RKFold_Control <- trainControl(method="repeatedcv",
                              summaryFunction = twoClassSummary,
                              number=5,
                              repeats=5,
                              classProbs = TRUE)

##################################
# Setting the conditions
# for hyperparameter tuning
##################################
AB_Grid = expand.grid(mfinal = c(50,100),
                      maxdepth = c(4,5,6),
                      coeflearn = "Breiman")

##################################
# Running the adaptive boosting model
# by setting the caret method to 'AdaBoost.M1'
##################################
set.seed(12345678)
MBS_AB_Tune <- train(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
                 y = MA_Train$diagnosis,
                 method = "AdaBoost.M1",
                 tuneGrid = AB_Grid,
                 metric = "ROC",
                 trControl = RKFold_Control)

##################################
# Reporting the cross-validation results
# for the train set
##################################
MBS_AB_Tune
## AdaBoost.M1 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 729, 729, 730, 730, 730, 730, ... 
## Resampling results across tuning parameters:
## 
##   maxdepth  mfinal  ROC        Sens       Spec     
##   4          50     0.9506186  0.8800000  0.9335561
##   4         100     0.9575073  0.8994118  0.9384409
##   5          50     0.9575898  0.8941176  0.9412265
##   5         100     0.9631176  0.8982353  0.9412265
##   6          50     0.9619147  0.8988235  0.9415927
##   6         100     0.9647554  0.9011765  0.9398352
## 
## Tuning parameter 'coeflearn' was held constant at a value of Breiman
## ROC was used to select the optimal model using the largest value.
## The final values used for the model were mfinal = 100, maxdepth = 6
##  and coeflearn = Breiman.
MBS_AB_Tune$finalModel
## $formula
## .outcome ~ .
## <environment: 0x000000003ec71dd8>
## 
## $trees
## $trees[[1]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 345 B (0.378289474 0.621710526)  
##     2) smoothness_worst>=-1.500665 380 139 M (0.634210526 0.365789474)  
##       4) texture_mean>=2.931727 222  34 M (0.846846847 0.153153153)  
##         8) symmetry_worst>=-1.45531 64   0 M (1.000000000 0.000000000) *
##         9) symmetry_worst< -1.45531 158  34 M (0.784810127 0.215189873)  
##          18) smoothness_worst< -1.437613 101  11 M (0.891089109 0.108910891)  
##            36) smoothness_worst< -1.483884 32   0 M (1.000000000 0.000000000) *
##            37) smoothness_worst>=-1.483884 69  11 M (0.840579710 0.159420290)  
##              74) smoothness_worst>=-1.482699 66   8 M (0.878787879 0.121212121) *
##              75) smoothness_worst< -1.482699 3   0 B (0.000000000 1.000000000) *
##          19) smoothness_worst>=-1.437613 57  23 M (0.596491228 0.403508772)  
##            38) smoothness_mean>=-2.292155 42  10 M (0.761904762 0.238095238)  
##              76) smoothness_mean< -2.093138 36   5 M (0.861111111 0.138888889) *
##              77) smoothness_mean>=-2.093138 6   1 B (0.166666667 0.833333333) *
##            39) smoothness_mean< -2.292155 15   2 B (0.133333333 0.866666667)  
##              78) texture_mean>=3.075523 3   1 M (0.666666667 0.333333333) *
##              79) texture_mean< 3.075523 12   0 B (0.000000000 1.000000000) *
##       5) texture_mean< 2.931727 158  53 B (0.335443038 0.664556962)  
##        10) compactness_se>=-3.891799 76  29 M (0.618421053 0.381578947)  
##          20) symmetry_worst>=-1.668672 45   6 M (0.866666667 0.133333333)  
##            40) smoothness_mean< -1.889548 43   4 M (0.906976744 0.093023256)  
##              80) compactness_se< -2.646661 42   3 M (0.928571429 0.071428571) *
##              81) compactness_se>=-2.646661 1   0 B (0.000000000 1.000000000) *
##            41) smoothness_mean>=-1.889548 2   0 B (0.000000000 1.000000000) *
##          21) symmetry_worst< -1.668672 31   8 B (0.258064516 0.741935484)  
##            42) compactness_se< -3.854964 5   0 M (1.000000000 0.000000000) *
##            43) compactness_se>=-3.854964 26   3 B (0.115384615 0.884615385)  
##              86) smoothness_worst< -1.493233 3   1 M (0.666666667 0.333333333) *
##              87) smoothness_worst>=-1.493233 23   1 B (0.043478261 0.956521739) *
##        11) compactness_se< -3.891799 82   6 B (0.073170732 0.926829268)  
##          22) smoothness_worst< -1.49885 3   0 M (1.000000000 0.000000000) *
##          23) smoothness_worst>=-1.49885 79   3 B (0.037974684 0.962025316)  
##            46) texture_worst>=4.68139 3   1 M (0.666666667 0.333333333)  
##              92) texture_mean>=2.892522 2   0 M (1.000000000 0.000000000) *
##              93) texture_mean< 2.892522 1   0 B (0.000000000 1.000000000) *
##            47) texture_worst< 4.68139 76   1 B (0.013157895 0.986842105)  
##              94) compactness_se>=-3.970723 12   1 B (0.083333333 0.916666667) *
##              95) compactness_se< -3.970723 64   0 B (0.000000000 1.000000000) *
##     3) smoothness_worst< -1.500665 532 104 B (0.195488722 0.804511278)  
##       6) texture_mean>=3.007414 171  72 B (0.421052632 0.578947368)  
##        12) compactness_se>=-3.021724 22   2 M (0.909090909 0.090909091)  
##          24) texture_mean>=3.038537 20   0 M (1.000000000 0.000000000) *
##          25) texture_mean< 3.038537 2   0 B (0.000000000 1.000000000) *
##        13) compactness_se< -3.021724 149  52 B (0.348993289 0.651006711)  
##          26) smoothness_mean>=-2.508076 113  52 B (0.460176991 0.539823009)  
##            52) symmetry_worst>=-1.527595 16   2 M (0.875000000 0.125000000)  
##             104) smoothness_worst< -1.513943 14   0 M (1.000000000 0.000000000) *
##             105) smoothness_worst>=-1.513943 2   0 B (0.000000000 1.000000000) *
##            53) symmetry_worst< -1.527595 97  38 B (0.391752577 0.608247423)  
##             106) smoothness_mean< -2.503847 6   0 M (1.000000000 0.000000000) *
##             107) smoothness_mean>=-2.503847 91  32 B (0.351648352 0.648351648) *
##          27) smoothness_mean< -2.508076 36   0 B (0.000000000 1.000000000) *
##       7) texture_mean< 3.007414 361  32 B (0.088642659 0.911357341)  
##        14) texture_worst>=4.888103 3   0 M (1.000000000 0.000000000) *
##        15) texture_worst< 4.888103 358  29 B (0.081005587 0.918994413)  
##          30) compactness_se>=-3.953942 104  18 B (0.173076923 0.826923077)  
##            60) compactness_se< -3.48221 39  17 B (0.435897436 0.564102564)  
##             120) compactness_se>=-3.623844 16   3 M (0.812500000 0.187500000) *
##             121) compactness_se< -3.623844 23   4 B (0.173913043 0.826086957) *
##            61) compactness_se>=-3.48221 65   1 B (0.015384615 0.984615385)  
##             122) texture_mean>=2.984668 9   1 B (0.111111111 0.888888889) *
##             123) texture_mean< 2.984668 56   0 B (0.000000000 1.000000000) *
##          31) compactness_se< -3.953942 254  11 B (0.043307087 0.956692913)  
##            62) texture_mean>=2.947329 28   6 B (0.214285714 0.785714286)  
##             124) smoothness_worst>=-1.55438 6   1 M (0.833333333 0.166666667) *
##             125) smoothness_worst< -1.55438 22   1 B (0.045454545 0.954545455) *
##            63) texture_mean< 2.947329 226   5 B (0.022123894 0.977876106)  
##             126) texture_worst>=4.389974 50   4 B (0.080000000 0.920000000) *
##             127) texture_worst< 4.389974 176   1 B (0.005681818 0.994318182) *
## 
## $trees[[2]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 382 B (0.41885965 0.58114035)  
##     2) texture_mean>=2.927988 495 174 M (0.64848485 0.35151515)  
##       4) smoothness_mean>=-2.425205 352  76 M (0.78409091 0.21590909)  
##         8) compactness_se>=-3.797621 230  26 M (0.88695652 0.11304348)  
##          16) symmetry_worst>=-2.184494 209  12 M (0.94258373 0.05741627)  
##            32) texture_worst>=4.35267 207  10 M (0.95169082 0.04830918)  
##              64) texture_mean< 3.36829 199   7 M (0.96482412 0.03517588) *
##              65) texture_mean>=3.36829 8   3 M (0.62500000 0.37500000) *
##            33) texture_worst< 4.35267 2   0 B (0.00000000 1.00000000) *
##          17) symmetry_worst< -2.184494 21   7 B (0.33333333 0.66666667)  
##            34) symmetry_worst< -2.271177 7   0 M (1.00000000 0.00000000) *
##            35) symmetry_worst>=-2.271177 14   0 B (0.00000000 1.00000000) *
##         9) compactness_se< -3.797621 122  50 M (0.59016393 0.40983607)  
##          18) texture_mean>=3.057767 57   9 M (0.84210526 0.15789474)  
##            36) texture_worst>=4.527762 53   5 M (0.90566038 0.09433962)  
##              72) compactness_se< -3.872601 52   4 M (0.92307692 0.07692308) *
##              73) compactness_se>=-3.872601 1   0 B (0.00000000 1.00000000) *
##            37) texture_worst< 4.527762 4   0 B (0.00000000 1.00000000) *
##          19) texture_mean< 3.057767 65  24 B (0.36923077 0.63076923)  
##            38) smoothness_mean< -2.309577 26   8 M (0.69230769 0.30769231)  
##              76) smoothness_worst>=-1.514694 15   0 M (1.00000000 0.00000000) *
##              77) smoothness_worst< -1.514694 11   3 B (0.27272727 0.72727273) *
##            39) smoothness_mean>=-2.309577 39   6 B (0.15384615 0.84615385)  
##              78) smoothness_mean>=-2.244788 12   5 B (0.41666667 0.58333333) *
##              79) smoothness_mean< -2.244788 27   1 B (0.03703704 0.96296296) *
##       5) smoothness_mean< -2.425205 143  45 B (0.31468531 0.68531469)  
##        10) symmetry_worst>=-1.695215 47  22 M (0.53191489 0.46808511)  
##          20) compactness_se< -4.088469 28   6 M (0.78571429 0.21428571)  
##            40) texture_mean>=2.964399 22   0 M (1.00000000 0.00000000) *
##            41) texture_mean< 2.964399 6   0 B (0.00000000 1.00000000) *
##          21) compactness_se>=-4.088469 19   3 B (0.15789474 0.84210526)  
##            42) texture_worst>=5.003123 3   0 M (1.00000000 0.00000000) *
##            43) texture_worst< 5.003123 16   0 B (0.00000000 1.00000000) *
##        11) symmetry_worst< -1.695215 96  20 B (0.20833333 0.79166667)  
##          22) smoothness_worst>=-1.556752 24   9 M (0.62500000 0.37500000)  
##            44) texture_mean>=3.061509 19   4 M (0.78947368 0.21052632)  
##              88) compactness_se< -2.942351 16   1 M (0.93750000 0.06250000) *
##              89) compactness_se>=-2.942351 3   0 B (0.00000000 1.00000000) *
##            45) texture_mean< 3.061509 5   0 B (0.00000000 1.00000000) *
##          23) smoothness_worst< -1.556752 72   5 B (0.06944444 0.93055556)  
##            46) compactness_se>=-3.612359 20   4 B (0.20000000 0.80000000)  
##              92) compactness_se< -3.580055 2   0 M (1.00000000 0.00000000) *
##              93) compactness_se>=-3.580055 18   2 B (0.11111111 0.88888889) *
##            47) compactness_se< -3.612359 52   1 B (0.01923077 0.98076923)  
##              94) texture_mean< 2.966301 11   1 B (0.09090909 0.90909091) *
##              95) texture_mean>=2.966301 41   0 B (0.00000000 1.00000000) *
##     3) texture_mean< 2.927988 417  61 B (0.14628297 0.85371703)  
##       6) symmetry_worst>=-1.36527 29  10 M (0.65517241 0.34482759)  
##        12) texture_worst>=4.373597 13   0 M (1.00000000 0.00000000) *
##        13) texture_worst< 4.373597 16   6 B (0.37500000 0.62500000)  
##          26) compactness_se>=-3.446692 8   2 M (0.75000000 0.25000000)  
##            52) smoothness_mean>=-2.235399 6   0 M (1.00000000 0.00000000) *
##            53) smoothness_mean< -2.235399 2   0 B (0.00000000 1.00000000) *
##          27) compactness_se< -3.446692 8   0 B (0.00000000 1.00000000) *
##       7) symmetry_worst< -1.36527 388  42 B (0.10824742 0.89175258)  
##        14) smoothness_mean>=-2.468758 288  42 B (0.14583333 0.85416667)  
##          28) smoothness_mean< -2.467991 3   0 M (1.00000000 0.00000000) *
##          29) smoothness_mean>=-2.467991 285  39 B (0.13684211 0.86315789)  
##            58) texture_mean>=2.656737 211  38 B (0.18009479 0.81990521)  
##             116) texture_mean< 2.666527 4   0 M (1.00000000 0.00000000) *
##             117) texture_mean>=2.666527 207  34 B (0.16425121 0.83574879) *
##            59) texture_mean< 2.656737 74   1 B (0.01351351 0.98648649)  
##             118) smoothness_mean>=-2.074653 3   1 B (0.33333333 0.66666667) *
##             119) smoothness_mean< -2.074653 71   0 B (0.00000000 1.00000000) *
##        15) smoothness_mean< -2.468758 100   0 B (0.00000000 1.00000000) *
## 
## $trees[[3]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 415 B (0.45504386 0.54495614)  
##     2) texture_worst>=4.275472 677 300 M (0.55686854 0.44313146)  
##       4) smoothness_worst>=-1.556752 487 165 M (0.66119097 0.33880903)  
##         8) symmetry_worst>=-2.027922 424 123 M (0.70990566 0.29009434)  
##          16) smoothness_mean>=-2.469882 410 110 M (0.73170732 0.26829268)  
##            32) symmetry_worst>=-1.329407 41   0 M (1.00000000 0.00000000) *
##            33) symmetry_worst< -1.329407 369 110 M (0.70189702 0.29810298)  
##              66) smoothness_worst< -1.51308 116  19 M (0.83620690 0.16379310) *
##              67) smoothness_worst>=-1.51308 253  91 M (0.64031621 0.35968379) *
##          17) smoothness_mean< -2.469882 14   1 B (0.07142857 0.92857143)  
##            34) compactness_se>=-3.935569 1   0 M (1.00000000 0.00000000) *
##            35) compactness_se< -3.935569 13   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst< -2.027922 63  21 B (0.33333333 0.66666667)  
##          18) texture_worst>=4.583884 42  21 M (0.50000000 0.50000000)  
##            36) texture_worst< 5.117452 29   8 M (0.72413793 0.27586207)  
##              72) compactness_se>=-4.170636 24   4 M (0.83333333 0.16666667) *
##              73) compactness_se< -4.170636 5   1 B (0.20000000 0.80000000) *
##            37) texture_worst>=5.117452 13   0 B (0.00000000 1.00000000) *
##          19) texture_worst< 4.583884 21   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst< -1.556752 190  55 B (0.28947368 0.71052632)  
##        10) symmetry_worst< -2.227786 23   7 M (0.69565217 0.30434783)  
##          20) smoothness_mean>=-2.484324 18   2 M (0.88888889 0.11111111)  
##            40) texture_mean< 3.379986 17   1 M (0.94117647 0.05882353)  
##              80) smoothness_mean>=-2.401038 12   0 M (1.00000000 0.00000000) *
##              81) smoothness_mean< -2.401038 5   1 M (0.80000000 0.20000000) *
##            41) texture_mean>=3.379986 1   0 B (0.00000000 1.00000000) *
##          21) smoothness_mean< -2.484324 5   0 B (0.00000000 1.00000000) *
##        11) symmetry_worst>=-2.227786 167  39 B (0.23353293 0.76646707)  
##          22) symmetry_worst>=-1.238986 6   0 M (1.00000000 0.00000000) *
##          23) symmetry_worst< -1.238986 161  33 B (0.20496894 0.79503106)  
##            46) texture_worst>=5.066446 15   6 M (0.60000000 0.40000000)  
##              92) texture_mean< 3.332536 6   0 M (1.00000000 0.00000000) *
##              93) texture_mean>=3.332536 9   3 B (0.33333333 0.66666667) *
##            47) texture_worst< 5.066446 146  24 B (0.16438356 0.83561644)  
##              94) smoothness_worst< -1.720903 7   2 M (0.71428571 0.28571429) *
##              95) smoothness_worst>=-1.720903 139  19 B (0.13669065 0.86330935) *
##     3) texture_worst< 4.275472 235  38 B (0.16170213 0.83829787)  
##       6) compactness_se>=-3.958868 105  38 B (0.36190476 0.63809524)  
##        12) symmetry_worst>=-1.42974 28  10 M (0.64285714 0.35714286)  
##          24) compactness_se>=-3.391558 13   0 M (1.00000000 0.00000000) *
##          25) compactness_se< -3.391558 15   5 B (0.33333333 0.66666667)  
##            50) smoothness_mean< -2.393992 5   0 M (1.00000000 0.00000000) *
##            51) smoothness_mean>=-2.393992 10   0 B (0.00000000 1.00000000) *
##        13) symmetry_worst< -1.42974 77  20 B (0.25974026 0.74025974)  
##          26) compactness_se< -3.492332 43  19 B (0.44186047 0.55813953)  
##            52) compactness_se>=-3.764682 26   9 M (0.65384615 0.34615385)  
##             104) texture_mean< 2.716348 14   0 M (1.00000000 0.00000000) *
##             105) texture_mean>=2.716348 12   3 B (0.25000000 0.75000000) *
##            53) compactness_se< -3.764682 17   2 B (0.11764706 0.88235294)  
##             106) texture_worst>=4.134459 4   2 M (0.50000000 0.50000000) *
##             107) texture_worst< 4.134459 13   0 B (0.00000000 1.00000000) *
##          27) compactness_se>=-3.492332 34   1 B (0.02941176 0.97058824)  
##            54) compactness_se< -3.48221 6   1 B (0.16666667 0.83333333)  
##             108) texture_mean< 2.787307 1   0 M (1.00000000 0.00000000) *
##             109) texture_mean>=2.787307 5   0 B (0.00000000 1.00000000) *
##            55) compactness_se>=-3.48221 28   0 B (0.00000000 1.00000000) *
##       7) compactness_se< -3.958868 130   0 B (0.00000000 1.00000000) *
## 
## $trees[[4]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 364 B (0.39912281 0.60087719)  
##     2) compactness_se>=-3.721197 380 152 M (0.60000000 0.40000000)  
##       4) texture_mean>=3.054236 164  40 M (0.75609756 0.24390244)  
##         8) symmetry_worst>=-2.029591 130  20 M (0.84615385 0.15384615)  
##          16) smoothness_mean>=-2.41714 98   6 M (0.93877551 0.06122449)  
##            32) smoothness_mean< -2.105484 92   3 M (0.96739130 0.03260870)  
##              64) smoothness_worst>=-1.609426 91   2 M (0.97802198 0.02197802) *
##              65) smoothness_worst< -1.609426 1   0 B (0.00000000 1.00000000) *
##            33) smoothness_mean>=-2.105484 6   3 M (0.50000000 0.50000000)  
##              66) texture_mean< 3.186512 3   0 M (1.00000000 0.00000000) *
##              67) texture_mean>=3.186512 3   0 B (0.00000000 1.00000000) *
##          17) smoothness_mean< -2.41714 32  14 M (0.56250000 0.43750000)  
##            34) smoothness_mean< -2.453967 19   1 M (0.94736842 0.05263158)  
##              68) smoothness_worst>=-1.612487 18   0 M (1.00000000 0.00000000) *
##              69) smoothness_worst< -1.612487 1   0 B (0.00000000 1.00000000) *
##            35) smoothness_mean>=-2.453967 13   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst< -2.029591 34  14 B (0.41176471 0.58823529)  
##          18) texture_worst< 4.645452 10   0 M (1.00000000 0.00000000) *
##          19) texture_worst>=4.645452 24   4 B (0.16666667 0.83333333)  
##            38) symmetry_worst< -2.242858 6   2 M (0.66666667 0.33333333)  
##              76) texture_mean>=3.208081 4   0 M (1.00000000 0.00000000) *
##              77) texture_mean< 3.208081 2   0 B (0.00000000 1.00000000) *
##            39) symmetry_worst>=-2.242858 18   0 B (0.00000000 1.00000000) *
##       5) texture_mean< 3.054236 216 104 B (0.48148148 0.51851852)  
##        10) compactness_se< -3.427747 107  31 M (0.71028037 0.28971963)  
##          20) texture_mean>=2.628033 99  23 M (0.76767677 0.23232323)  
##            40) texture_mean< 3.040702 92  16 M (0.82608696 0.17391304)  
##              80) smoothness_mean>=-2.443631 81   8 M (0.90123457 0.09876543) *
##              81) smoothness_mean< -2.443631 11   3 B (0.27272727 0.72727273) *
##            41) texture_mean>=3.040702 7   0 B (0.00000000 1.00000000) *
##          21) texture_mean< 2.628033 8   0 B (0.00000000 1.00000000) *
##        11) compactness_se>=-3.427747 109  28 B (0.25688073 0.74311927)  
##          22) symmetry_worst>=-1.300369 16   1 M (0.93750000 0.06250000)  
##            44) compactness_se< -2.524297 15   0 M (1.00000000 0.00000000) *
##            45) compactness_se>=-2.524297 1   0 B (0.00000000 1.00000000) *
##          23) symmetry_worst< -1.300369 93  13 B (0.13978495 0.86021505)  
##            46) symmetry_worst>=-1.853888 48  13 B (0.27083333 0.72916667)  
##              92) texture_mean>=2.96681 19   9 M (0.52631579 0.47368421) *
##              93) texture_mean< 2.96681 29   3 B (0.10344828 0.89655172) *
##            47) symmetry_worst< -1.853888 45   0 B (0.00000000 1.00000000) *
##     3) compactness_se< -3.721197 532 136 B (0.25563910 0.74436090)  
##       6) texture_worst>=4.36289 386 128 B (0.33160622 0.66839378)  
##        12) smoothness_worst>=-1.424105 30   5 M (0.83333333 0.16666667)  
##          24) smoothness_mean>=-2.397334 25   0 M (1.00000000 0.00000000) *
##          25) smoothness_mean< -2.397334 5   0 B (0.00000000 1.00000000) *
##        13) smoothness_worst< -1.424105 356 103 B (0.28932584 0.71067416)  
##          26) smoothness_mean< -2.260964 281  99 B (0.35231317 0.64768683)  
##            52) smoothness_mean>=-2.272056 8   0 M (1.00000000 0.00000000) *
##            53) smoothness_mean< -2.272056 273  91 B (0.33333333 0.66666667)  
##             106) compactness_se< -4.039628 175  74 B (0.42285714 0.57714286) *
##             107) compactness_se>=-4.039628 98  17 B (0.17346939 0.82653061) *
##          27) smoothness_mean>=-2.260964 75   4 B (0.05333333 0.94666667)  
##            54) texture_mean< 2.844609 2   1 M (0.50000000 0.50000000)  
##             108) texture_mean>=2.831705 1   0 M (1.00000000 0.00000000) *
##             109) texture_mean< 2.831705 1   0 B (0.00000000 1.00000000) *
##            55) texture_mean>=2.844609 73   3 B (0.04109589 0.95890411)  
##             110) symmetry_worst>=-1.611386 23   3 B (0.13043478 0.86956522) *
##             111) symmetry_worst< -1.611386 50   0 B (0.00000000 1.00000000) *
##       7) texture_worst< 4.36289 146   8 B (0.05479452 0.94520548)  
##        14) symmetry_worst>=-1.428729 6   2 M (0.66666667 0.33333333)  
##          28) texture_mean>=2.772337 4   0 M (1.00000000 0.00000000) *
##          29) texture_mean< 2.772337 2   0 B (0.00000000 1.00000000) *
##        15) symmetry_worst< -1.428729 140   4 B (0.02857143 0.97142857)  
##          30) smoothness_mean>=-2.081877 2   0 M (1.00000000 0.00000000) *
##          31) smoothness_mean< -2.081877 138   2 B (0.01449275 0.98550725)  
##            62) compactness_se>=-3.892047 15   2 B (0.13333333 0.86666667)  
##             124) compactness_se< -3.878107 2   0 M (1.00000000 0.00000000) *
##             125) compactness_se>=-3.878107 13   0 B (0.00000000 1.00000000) *
##            63) compactness_se< -3.892047 123   0 B (0.00000000 1.00000000) *
## 
## $trees[[5]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 444 B (0.48684211 0.51315789)  
##     2) texture_mean>=2.811204 751 326 M (0.56591212 0.43408788)  
##       4) compactness_se>=-4.779408 733 308 M (0.57980900 0.42019100)  
##         8) texture_mean>=3.116842 176  46 M (0.73863636 0.26136364)  
##          16) smoothness_mean>=-2.489159 154  30 M (0.80519481 0.19480519)  
##            32) compactness_se>=-4.543049 146  22 M (0.84931507 0.15068493)  
##              64) smoothness_mean< -2.099273 141  17 M (0.87943262 0.12056738) *
##              65) smoothness_mean>=-2.099273 5   0 B (0.00000000 1.00000000) *
##            33) compactness_se< -4.543049 8   0 B (0.00000000 1.00000000) *
##          17) smoothness_mean< -2.489159 22   6 B (0.27272727 0.72727273)  
##            34) texture_mean< 3.190706 6   0 M (1.00000000 0.00000000) *
##            35) texture_mean>=3.190706 16   0 B (0.00000000 1.00000000) *
##         9) texture_mean< 3.116842 557 262 M (0.52962298 0.47037702)  
##          18) symmetry_worst>=-1.107986 24   0 M (1.00000000 0.00000000) *
##          19) symmetry_worst< -1.107986 533 262 M (0.50844278 0.49155722)  
##            38) symmetry_worst>=-1.862978 386 170 M (0.55958549 0.44041451)  
##              76) texture_mean< 3.11507 374 158 M (0.57754011 0.42245989) *
##              77) texture_mean>=3.11507 12   0 B (0.00000000 1.00000000) *
##            39) symmetry_worst< -1.862978 147  55 B (0.37414966 0.62585034)  
##              78) texture_worst>=4.905691 8   0 M (1.00000000 0.00000000) *
##              79) texture_worst< 4.905691 139  47 B (0.33812950 0.66187050) *
##       5) compactness_se< -4.779408 18   0 B (0.00000000 1.00000000) *
##     3) texture_mean< 2.811204 161  19 B (0.11801242 0.88198758)  
##       6) symmetry_worst>=-1.281003 5   0 M (1.00000000 0.00000000) *
##       7) symmetry_worst< -1.281003 156  14 B (0.08974359 0.91025641)  
##        14) smoothness_mean>=-1.977294 4   0 M (1.00000000 0.00000000) *
##        15) smoothness_mean< -1.977294 152  10 B (0.06578947 0.93421053)  
##          30) smoothness_mean>=-2.321264 59   9 B (0.15254237 0.84745763)  
##            60) smoothness_mean< -2.287239 22   9 B (0.40909091 0.59090909)  
##             120) texture_worst>=4.138116 5   0 M (1.00000000 0.00000000) *
##             121) texture_worst< 4.138116 17   4 B (0.23529412 0.76470588) *
##            61) smoothness_mean>=-2.287239 37   0 B (0.00000000 1.00000000) *
##          31) smoothness_mean< -2.321264 93   1 B (0.01075269 0.98924731)  
##            62) compactness_se>=-3.488718 12   1 B (0.08333333 0.91666667)  
##             124) compactness_se< -3.483667 1   0 M (1.00000000 0.00000000) *
##             125) compactness_se>=-3.483667 11   0 B (0.00000000 1.00000000) *
##            63) compactness_se< -3.488718 81   0 B (0.00000000 1.00000000) *
## 
## $trees[[6]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 393 B (0.43092105 0.56907895)  
##     2) texture_mean>=3.058002 294 107 M (0.63605442 0.36394558)  
##       4) symmetry_worst>=-1.71268 124  21 M (0.83064516 0.16935484)  
##         8) texture_worst>=4.818867 96   7 M (0.92708333 0.07291667)  
##          16) smoothness_mean>=-2.509617 95   6 M (0.93684211 0.06315789)  
##            32) smoothness_mean>=-2.334545 54   0 M (1.00000000 0.00000000) *
##            33) smoothness_mean< -2.334545 41   6 M (0.85365854 0.14634146)  
##              66) smoothness_mean< -2.347634 38   3 M (0.92105263 0.07894737) *
##              67) smoothness_mean>=-2.347634 3   0 B (0.00000000 1.00000000) *
##          17) smoothness_mean< -2.509617 1   0 B (0.00000000 1.00000000) *
##         9) texture_worst< 4.818867 28  14 M (0.50000000 0.50000000)  
##          18) texture_worst< 4.790105 13   0 M (1.00000000 0.00000000) *
##          19) texture_worst>=4.790105 15   1 B (0.06666667 0.93333333)  
##            38) smoothness_mean< -2.321477 1   0 M (1.00000000 0.00000000) *
##            39) smoothness_mean>=-2.321477 14   0 B (0.00000000 1.00000000) *
##       5) symmetry_worst< -1.71268 170  84 B (0.49411765 0.50588235)  
##        10) symmetry_worst< -1.733593 145  61 M (0.57931034 0.42068966)  
##          20) smoothness_worst>=-1.603555 118  38 M (0.67796610 0.32203390)  
##            40) symmetry_worst>=-2.184494 91  20 M (0.78021978 0.21978022)  
##              80) smoothness_worst< -1.415354 81  11 M (0.86419753 0.13580247) *
##              81) smoothness_worst>=-1.415354 10   1 B (0.10000000 0.90000000) *
##            41) symmetry_worst< -2.184494 27   9 B (0.33333333 0.66666667)  
##              82) smoothness_mean< -2.437515 6   0 M (1.00000000 0.00000000) *
##              83) smoothness_mean>=-2.437515 21   3 B (0.14285714 0.85714286) *
##          21) smoothness_worst< -1.603555 27   4 B (0.14814815 0.85185185)  
##            42) smoothness_mean>=-2.373736 3   0 M (1.00000000 0.00000000) *
##            43) smoothness_mean< -2.373736 24   1 B (0.04166667 0.95833333)  
##              86) texture_worst< 4.508695 2   1 M (0.50000000 0.50000000) *
##              87) texture_worst>=4.508695 22   0 B (0.00000000 1.00000000) *
##        11) symmetry_worst>=-1.733593 25   0 B (0.00000000 1.00000000) *
##     3) texture_mean< 3.058002 618 206 B (0.33333333 0.66666667)  
##       6) texture_mean>=2.709047 536 200 B (0.37313433 0.62686567)  
##        12) smoothness_worst>=-1.374428 14   1 M (0.92857143 0.07142857)  
##          24) symmetry_worst>=-1.846189 13   0 M (1.00000000 0.00000000) *
##          25) symmetry_worst< -1.846189 1   0 B (0.00000000 1.00000000) *
##        13) smoothness_worst< -1.374428 522 187 B (0.35823755 0.64176245)  
##          26) compactness_se>=-4.717333 489 186 B (0.38036810 0.61963190)  
##            52) compactness_se< -4.594248 32   9 M (0.71875000 0.28125000)  
##             104) smoothness_worst< -1.54201 27   4 M (0.85185185 0.14814815) *
##             105) smoothness_worst>=-1.54201 5   0 B (0.00000000 1.00000000) *
##            53) compactness_se>=-4.594248 457 163 B (0.35667396 0.64332604)  
##             106) smoothness_mean>=-2.434347 375 150 B (0.40000000 0.60000000) *
##             107) smoothness_mean< -2.434347 82  13 B (0.15853659 0.84146341) *
##          27) compactness_se< -4.717333 33   1 B (0.03030303 0.96969697)  
##            54) texture_mean>=2.991714 1   0 M (1.00000000 0.00000000) *
##            55) texture_mean< 2.991714 32   0 B (0.00000000 1.00000000) *
##       7) texture_mean< 2.709047 82   6 B (0.07317073 0.92682927)  
##        14) symmetry_worst>=-1.122487 2   0 M (1.00000000 0.00000000) *
##        15) symmetry_worst< -1.122487 80   4 B (0.05000000 0.95000000)  
##          30) texture_mean< 2.487336 8   2 B (0.25000000 0.75000000)  
##            60) texture_mean>=2.434062 2   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 2.434062 6   0 B (0.00000000 1.00000000) *
##          31) texture_mean>=2.487336 72   2 B (0.02777778 0.97222222)  
##            62) symmetry_worst< -2.111279 14   2 B (0.14285714 0.85714286)  
##             124) smoothness_worst>=-1.49704 2   0 M (1.00000000 0.00000000) *
##             125) smoothness_worst< -1.49704 12   0 B (0.00000000 1.00000000) *
##            63) symmetry_worst>=-2.111279 58   0 B (0.00000000 1.00000000) *
## 
## $trees[[7]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 912 445 M (0.51206140 0.48793860)  
##    2) texture_mean>=2.707375 842 379 M (0.54988124 0.45011876)  
##      4) compactness_se>=-4.706178 803 340 M (0.57658780 0.42341220)  
##        8) symmetry_worst>=-1.329407 50   7 M (0.86000000 0.14000000)  
##         16) texture_mean< 3.099059 39   1 M (0.97435897 0.02564103)  
##           32) texture_mean>=2.756192 38   0 M (1.00000000 0.00000000) *
##           33) texture_mean< 2.756192 1   0 B (0.00000000 1.00000000) *
##         17) texture_mean>=3.099059 11   5 B (0.45454545 0.54545455)  
##           34) texture_mean>=3.141437 5   0 M (1.00000000 0.00000000) *
##           35) texture_mean< 3.141437 6   0 B (0.00000000 1.00000000) *
##        9) symmetry_worst< -1.329407 753 333 M (0.55776892 0.44223108)  
##         18) smoothness_mean>=-2.546123 734 315 M (0.57084469 0.42915531)  
##           36) symmetry_worst< -1.925345 193  63 M (0.67357513 0.32642487)  
##             72) smoothness_worst>=-1.604472 169  44 M (0.73964497 0.26035503) *
##             73) smoothness_worst< -1.604472 24   5 B (0.20833333 0.79166667) *
##           37) symmetry_worst>=-1.925345 541 252 M (0.53419593 0.46580407)  
##             74) smoothness_worst< -1.596198 49   8 M (0.83673469 0.16326531) *
##             75) smoothness_worst>=-1.596198 492 244 M (0.50406504 0.49593496) *
##         19) smoothness_mean< -2.546123 19   1 B (0.05263158 0.94736842)  
##           38) smoothness_worst< -1.720903 4   1 B (0.25000000 0.75000000)  
##             76) compactness_se>=-3.013033 1   0 M (1.00000000 0.00000000) *
##             77) compactness_se< -3.013033 3   0 B (0.00000000 1.00000000) *
##           39) smoothness_worst>=-1.720903 15   0 B (0.00000000 1.00000000) *
##      5) compactness_se< -4.706178 39   0 B (0.00000000 1.00000000) *
##    3) texture_mean< 2.707375 70   4 B (0.05714286 0.94285714)  
##      6) symmetry_worst>=-1.15097 2   0 M (1.00000000 0.00000000) *
##      7) symmetry_worst< -1.15097 68   2 B (0.02941176 0.97058824)  
##       14) smoothness_mean>=-2.074653 3   1 M (0.66666667 0.33333333)  
##         28) texture_mean>=2.434062 2   0 M (1.00000000 0.00000000) *
##         29) texture_mean< 2.434062 1   0 B (0.00000000 1.00000000) *
##       15) smoothness_mean< -2.074653 65   0 B (0.00000000 1.00000000) *
## 
## $trees[[8]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 912 429 B (0.47039474 0.52960526)  
##    2) texture_mean>=2.709047 846 423 M (0.50000000 0.50000000)  
##      4) symmetry_worst>=-1.329407 52   6 M (0.88461538 0.11538462)  
##        8) texture_mean>=2.742062 50   4 M (0.92000000 0.08000000)  
##         16) texture_mean< 3.10949 38   0 M (1.00000000 0.00000000) *
##         17) texture_mean>=3.10949 12   4 M (0.66666667 0.33333333)  
##           34) texture_mean>=3.126045 8   0 M (1.00000000 0.00000000) *
##           35) texture_mean< 3.126045 4   0 B (0.00000000 1.00000000) *
##        9) texture_mean< 2.742062 2   0 B (0.00000000 1.00000000) *
##      5) symmetry_worst< -1.329407 794 377 B (0.47481108 0.52518892)  
##       10) compactness_se>=-4.705732 765 377 B (0.49281046 0.50718954)  
##         20) compactness_se< -4.448167 113  30 M (0.73451327 0.26548673)  
##           40) smoothness_mean< -2.295268 101  18 M (0.82178218 0.17821782)  
##             80) symmetry_worst< -1.478833 95  12 M (0.87368421 0.12631579) *
##             81) symmetry_worst>=-1.478833 6   0 B (0.00000000 1.00000000) *
##           41) smoothness_mean>=-2.295268 12   0 B (0.00000000 1.00000000) *
##         21) compactness_se>=-4.448167 652 294 B (0.45092025 0.54907975)  
##           42) texture_worst< 4.642157 366 177 M (0.51639344 0.48360656)  
##             84) smoothness_worst>=-1.456304 54   7 M (0.87037037 0.12962963) *
##             85) smoothness_worst< -1.456304 312 142 B (0.45512821 0.54487179) *
##           43) texture_worst>=4.642157 286 105 B (0.36713287 0.63286713)  
##             86) texture_worst>=5.03133 54  21 M (0.61111111 0.38888889) *
##             87) texture_worst< 5.03133 232  72 B (0.31034483 0.68965517) *
##       11) compactness_se< -4.705732 29   0 B (0.00000000 1.00000000) *
##    3) texture_mean< 2.709047 66   6 B (0.09090909 0.90909091)  
##      6) symmetry_worst>=-1.122487 3   0 M (1.00000000 0.00000000) *
##      7) symmetry_worst< -1.122487 63   3 B (0.04761905 0.95238095)  
##       14) smoothness_mean>=-2.074653 10   2 B (0.20000000 0.80000000)  
##         28) smoothness_mean< -2.060513 2   0 M (1.00000000 0.00000000) *
##         29) smoothness_mean>=-2.060513 8   0 B (0.00000000 1.00000000) *
##       15) smoothness_mean< -2.074653 53   1 B (0.01886792 0.98113208)  
##         30) symmetry_worst< -2.105665 6   1 B (0.16666667 0.83333333)  
##           60) smoothness_mean>=-2.312057 1   0 M (1.00000000 0.00000000) *
##           61) smoothness_mean< -2.312057 5   0 B (0.00000000 1.00000000) *
##         31) symmetry_worst>=-2.105665 47   0 B (0.00000000 1.00000000) *
## 
## $trees[[9]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 442 B (0.48464912 0.51535088)  
##     2) texture_mean>=2.824054 787 373 M (0.52604828 0.47395172)  
##       4) texture_worst< 4.982753 644 274 M (0.57453416 0.42546584)  
##         8) compactness_se>=-4.705732 622 254 M (0.59163987 0.40836013)  
##          16) smoothness_mean< -2.352368 300  93 M (0.69000000 0.31000000)  
##            32) compactness_se< -4.099264 144  24 M (0.83333333 0.16666667)  
##              64) texture_mean>=2.947329 93   6 M (0.93548387 0.06451613) *
##              65) texture_mean< 2.947329 51  18 M (0.64705882 0.35294118) *
##            33) compactness_se>=-4.099264 156  69 M (0.55769231 0.44230769)  
##              66) smoothness_mean>=-2.394871 63  10 M (0.84126984 0.15873016) *
##              67) smoothness_mean< -2.394871 93  34 B (0.36559140 0.63440860) *
##          17) smoothness_mean>=-2.352368 322 161 M (0.50000000 0.50000000)  
##            34) symmetry_worst>=-1.529476 58  12 M (0.79310345 0.20689655)  
##              68) compactness_se>=-4.127915 46   4 M (0.91304348 0.08695652) *
##              69) compactness_se< -4.127915 12   4 B (0.33333333 0.66666667) *
##            35) symmetry_worst< -1.529476 264 115 B (0.43560606 0.56439394)  
##              70) smoothness_mean>=-2.332015 245 115 B (0.46938776 0.53061224) *
##              71) smoothness_mean< -2.332015 19   0 B (0.00000000 1.00000000) *
##         9) compactness_se< -4.705732 22   2 B (0.09090909 0.90909091)  
##          18) symmetry_worst>=-1.179946 2   0 M (1.00000000 0.00000000) *
##          19) symmetry_worst< -1.179946 20   0 B (0.00000000 1.00000000) *
##       5) texture_worst>=4.982753 143  44 B (0.30769231 0.69230769)  
##        10) compactness_se>=-3.334337 23   6 M (0.73913043 0.26086957)  
##          20) texture_worst>=4.998431 17   0 M (1.00000000 0.00000000) *
##          21) texture_worst< 4.998431 6   0 B (0.00000000 1.00000000) *
##        11) compactness_se< -3.334337 120  27 B (0.22500000 0.77500000)  
##          22) smoothness_mean>=-2.362094 47  19 B (0.40425532 0.59574468)  
##            44) smoothness_worst< -1.450409 22   5 M (0.77272727 0.22727273)  
##              88) symmetry_worst>=-2.207988 17   0 M (1.00000000 0.00000000) *
##              89) symmetry_worst< -2.207988 5   0 B (0.00000000 1.00000000) *
##            45) smoothness_worst>=-1.450409 25   2 B (0.08000000 0.92000000)  
##              90) symmetry_worst>=-1.24413 1   0 M (1.00000000 0.00000000) *
##              91) symmetry_worst< -1.24413 24   1 B (0.04166667 0.95833333) *
##          23) smoothness_mean< -2.362094 73   8 B (0.10958904 0.89041096)  
##            46) symmetry_worst>=-1.554775 3   0 M (1.00000000 0.00000000) *
##            47) symmetry_worst< -1.554775 70   5 B (0.07142857 0.92857143)  
##              94) texture_worst>=5.636459 1   0 M (1.00000000 0.00000000) *
##              95) texture_worst< 5.636459 69   4 B (0.05797101 0.94202899) *
##     3) texture_mean< 2.824054 125  28 B (0.22400000 0.77600000)  
##       6) compactness_se>=-3.764682 55  26 B (0.47272727 0.52727273)  
##        12) smoothness_mean>=-2.31958 36  13 M (0.63888889 0.36111111)  
##          24) texture_worst>=3.973898 24   4 M (0.83333333 0.16666667)  
##            48) compactness_se< -3.364454 14   0 M (1.00000000 0.00000000) *
##            49) compactness_se>=-3.364454 10   4 M (0.60000000 0.40000000)  
##              98) symmetry_worst>=-1.316602 6   0 M (1.00000000 0.00000000) *
##              99) symmetry_worst< -1.316602 4   0 B (0.00000000 1.00000000) *
##          25) texture_worst< 3.973898 12   3 B (0.25000000 0.75000000)  
##            50) compactness_se< -3.688804 2   0 M (1.00000000 0.00000000) *
##            51) compactness_se>=-3.688804 10   1 B (0.10000000 0.90000000)  
##             102) texture_mean< 2.366153 1   0 M (1.00000000 0.00000000) *
##             103) texture_mean>=2.366153 9   0 B (0.00000000 1.00000000) *
##        13) smoothness_mean< -2.31958 19   3 B (0.15789474 0.84210526)  
##          26) symmetry_worst< -1.982852 3   1 M (0.66666667 0.33333333)  
##            52) texture_mean>=2.763153 2   0 M (1.00000000 0.00000000) *
##            53) texture_mean< 2.763153 1   0 B (0.00000000 1.00000000) *
##          27) symmetry_worst>=-1.982852 16   1 B (0.06250000 0.93750000)  
##            54) smoothness_worst>=-1.493125 2   1 M (0.50000000 0.50000000)  
##             108) texture_mean< 2.774841 1   0 M (1.00000000 0.00000000) *
##             109) texture_mean>=2.774841 1   0 B (0.00000000 1.00000000) *
##            55) smoothness_worst< -1.493125 14   0 B (0.00000000 1.00000000) *
##       7) compactness_se< -3.764682 70   2 B (0.02857143 0.97142857)  
##        14) symmetry_worst>=-1.431268 3   1 B (0.33333333 0.66666667)  
##          28) texture_mean>=2.799919 1   0 M (1.00000000 0.00000000) *
##          29) texture_mean< 2.799919 2   0 B (0.00000000 1.00000000) *
##        15) symmetry_worst< -1.431268 67   1 B (0.01492537 0.98507463)  
##          30) compactness_se>=-3.894783 8   1 B (0.12500000 0.87500000)  
##            60) compactness_se< -3.866661 1   0 M (1.00000000 0.00000000) *
##            61) compactness_se>=-3.866661 7   0 B (0.00000000 1.00000000) *
##          31) compactness_se< -3.894783 59   0 B (0.00000000 1.00000000) *
## 
## $trees[[10]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 438 M (0.51973684 0.48026316)  
##     2) texture_worst>=4.260219 794 348 M (0.56171285 0.43828715)  
##       4) compactness_se< -2.927016 740 305 M (0.58783784 0.41216216)  
##         8) compactness_se>=-3.322182 70   9 M (0.87142857 0.12857143)  
##          16) smoothness_worst>=-1.507356 35   0 M (1.00000000 0.00000000) *
##          17) smoothness_worst< -1.507356 35   9 M (0.74285714 0.25714286)  
##            34) smoothness_worst< -1.51411 28   2 M (0.92857143 0.07142857)  
##              68) texture_mean>=2.988153 26   0 M (1.00000000 0.00000000) *
##              69) texture_mean< 2.988153 2   0 B (0.00000000 1.00000000) *
##            35) smoothness_worst>=-1.51411 7   0 B (0.00000000 1.00000000) *
##         9) compactness_se< -3.322182 670 296 M (0.55820896 0.44179104)  
##          18) texture_worst< 4.642157 389 145 M (0.62724936 0.37275064)  
##            36) symmetry_worst< -1.559263 324 102 M (0.68518519 0.31481481)  
##              72) compactness_se>=-4.614925 306  84 M (0.72549020 0.27450980) *
##              73) compactness_se< -4.614925 18   0 B (0.00000000 1.00000000) *
##            37) symmetry_worst>=-1.559263 65  22 B (0.33846154 0.66153846)  
##              74) texture_worst>=4.614159 15   1 M (0.93333333 0.06666667) *
##              75) texture_worst< 4.614159 50   8 B (0.16000000 0.84000000) *
##          19) texture_worst>=4.642157 281 130 B (0.46263345 0.53736655)  
##            38) symmetry_worst>=-1.591238 69  17 M (0.75362319 0.24637681)  
##              76) texture_mean< 3.095125 45   3 M (0.93333333 0.06666667) *
##              77) texture_mean>=3.095125 24  10 B (0.41666667 0.58333333) *
##            39) symmetry_worst< -1.591238 212  78 B (0.36792453 0.63207547)  
##              78) texture_worst>=4.682677 181  78 B (0.43093923 0.56906077) *
##              79) texture_worst< 4.682677 31   0 B (0.00000000 1.00000000) *
##       5) compactness_se>=-2.927016 54  11 B (0.20370370 0.79629630)  
##        10) smoothness_worst>=-1.397207 7   0 M (1.00000000 0.00000000) *
##        11) smoothness_worst< -1.397207 47   4 B (0.08510638 0.91489362)  
##          22) symmetry_worst< -2.040594 3   0 M (1.00000000 0.00000000) *
##          23) symmetry_worst>=-2.040594 44   1 B (0.02272727 0.97727273)  
##            46) texture_mean>=3.063534 5   1 B (0.20000000 0.80000000)  
##              92) texture_mean< 3.166628 1   0 M (1.00000000 0.00000000) *
##              93) texture_mean>=3.166628 4   0 B (0.00000000 1.00000000) *
##            47) texture_mean< 3.063534 39   0 B (0.00000000 1.00000000) *
##     3) texture_worst< 4.260219 118  28 B (0.23728814 0.76271186)  
##       6) compactness_se>=-3.97985 73  28 B (0.38356164 0.61643836)  
##        12) texture_mean>=2.863053 9   1 M (0.88888889 0.11111111)  
##          24) smoothness_mean>=-2.316299 8   0 M (1.00000000 0.00000000) *
##          25) smoothness_mean< -2.316299 1   0 B (0.00000000 1.00000000) *
##        13) texture_mean< 2.863053 64  20 B (0.31250000 0.68750000)  
##          26) symmetry_worst>=-1.281003 5   0 M (1.00000000 0.00000000) *
##          27) symmetry_worst< -1.281003 59  15 B (0.25423729 0.74576271)  
##            54) compactness_se< -3.866661 7   1 M (0.85714286 0.14285714)  
##             108) texture_worst>=4.110502 6   0 M (1.00000000 0.00000000) *
##             109) texture_worst< 4.110502 1   0 B (0.00000000 1.00000000) *
##            55) compactness_se>=-3.866661 52   9 B (0.17307692 0.82692308)  
##             110) texture_mean< 2.525679 3   0 M (1.00000000 0.00000000) *
##             111) texture_mean>=2.525679 49   6 B (0.12244898 0.87755102) *
##       7) compactness_se< -3.97985 45   0 B (0.00000000 1.00000000) *
## 
## $trees[[11]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 419 M (0.54057018 0.45942982)  
##     2) texture_worst>=4.517889 608 231 M (0.62006579 0.37993421)  
##       4) texture_worst< 4.54138 80   6 M (0.92500000 0.07500000)  
##         8) symmetry_worst< -1.433387 78   4 M (0.94871795 0.05128205)  
##          16) compactness_se< -3.16075 75   2 M (0.97333333 0.02666667)  
##            32) smoothness_mean>=-2.440377 74   1 M (0.98648649 0.01351351)  
##              64) texture_mean< 3.086942 68   0 M (1.00000000 0.00000000) *
##              65) texture_mean>=3.086942 6   1 M (0.83333333 0.16666667) *
##            33) smoothness_mean< -2.440377 1   0 B (0.00000000 1.00000000) *
##          17) compactness_se>=-3.16075 3   1 B (0.33333333 0.66666667)  
##            34) texture_mean>=3.023554 1   0 M (1.00000000 0.00000000) *
##            35) texture_mean< 3.023554 2   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst>=-1.433387 2   0 B (0.00000000 1.00000000) *
##       5) texture_worst>=4.54138 528 225 M (0.57386364 0.42613636)  
##        10) texture_worst>=4.569492 494 192 M (0.61133603 0.38866397)  
##          20) compactness_se>=-4.505325 423 144 M (0.65957447 0.34042553)  
##            40) smoothness_worst< -1.400053 388 119 M (0.69329897 0.30670103)  
##              80) smoothness_worst>=-1.672049 381 112 M (0.70603675 0.29396325) *
##              81) smoothness_worst< -1.672049 7   0 B (0.00000000 1.00000000) *
##            41) smoothness_worst>=-1.400053 35  10 B (0.28571429 0.71428571)  
##              82) texture_mean>=3.044522 9   1 M (0.88888889 0.11111111) *
##              83) texture_mean< 3.044522 26   2 B (0.07692308 0.92307692) *
##          21) compactness_se< -4.505325 71  23 B (0.32394366 0.67605634)  
##            42) smoothness_worst< -1.549205 36  16 M (0.55555556 0.44444444)  
##              84) symmetry_worst>=-1.909332 26   6 M (0.76923077 0.23076923) *
##              85) symmetry_worst< -1.909332 10   0 B (0.00000000 1.00000000) *
##            43) smoothness_worst>=-1.549205 35   3 B (0.08571429 0.91428571)  
##              86) symmetry_worst< -1.696111 12   3 B (0.25000000 0.75000000) *
##              87) symmetry_worst>=-1.696111 23   0 B (0.00000000 1.00000000) *
##        11) texture_worst< 4.569492 34   1 B (0.02941176 0.97058824)  
##          22) texture_mean>=3.034949 1   0 M (1.00000000 0.00000000) *
##          23) texture_mean< 3.034949 33   0 B (0.00000000 1.00000000) *
##     3) texture_worst< 4.517889 304 116 B (0.38157895 0.61842105)  
##       6) compactness_se>=-3.766631 127  59 M (0.53543307 0.46456693)  
##        12) smoothness_mean>=-2.454939 106  40 M (0.62264151 0.37735849)  
##          24) compactness_se< -3.427747 57  12 M (0.78947368 0.21052632)  
##            48) symmetry_worst< -1.461208 52   7 M (0.86538462 0.13461538)  
##              96) texture_worst< 4.460444 50   5 M (0.90000000 0.10000000) *
##              97) texture_worst>=4.460444 2   0 B (0.00000000 1.00000000) *
##            49) symmetry_worst>=-1.461208 5   0 B (0.00000000 1.00000000) *
##          25) compactness_se>=-3.427747 49  21 B (0.42857143 0.57142857)  
##            50) smoothness_mean>=-2.149436 10   1 M (0.90000000 0.10000000)  
##             100) compactness_se>=-3.412571 9   0 M (1.00000000 0.00000000) *
##             101) compactness_se< -3.412571 1   0 B (0.00000000 1.00000000) *
##            51) smoothness_mean< -2.149436 39  12 B (0.30769231 0.69230769)  
##             102) texture_mean>=2.96681 9   1 M (0.88888889 0.11111111) *
##             103) texture_mean< 2.96681 30   4 B (0.13333333 0.86666667) *
##        13) smoothness_mean< -2.454939 21   2 B (0.09523810 0.90476190)  
##          26) texture_mean>=3.038737 2   0 M (1.00000000 0.00000000) *
##          27) texture_mean< 3.038737 19   0 B (0.00000000 1.00000000) *
##       7) compactness_se< -3.766631 177  48 B (0.27118644 0.72881356)  
##        14) texture_mean>=2.812409 114  47 B (0.41228070 0.58771930)  
##          28) smoothness_worst>=-1.451541 14   0 M (1.00000000 0.00000000) *
##          29) smoothness_worst< -1.451541 100  33 B (0.33000000 0.67000000)  
##            58) smoothness_worst< -1.538735 64  31 M (0.51562500 0.48437500)  
##             116) symmetry_worst< -1.548429 54  21 M (0.61111111 0.38888889) *
##             117) symmetry_worst>=-1.548429 10   0 B (0.00000000 1.00000000) *
##            59) smoothness_worst>=-1.538735 36   0 B (0.00000000 1.00000000) *
##        15) texture_mean< 2.812409 63   1 B (0.01587302 0.98412698)  
##          30) symmetry_worst< -1.930267 13   1 B (0.07692308 0.92307692)  
##            60) symmetry_worst>=-1.95343 1   0 M (1.00000000 0.00000000) *
##            61) symmetry_worst< -1.95343 12   0 B (0.00000000 1.00000000) *
##          31) symmetry_worst>=-1.930267 50   0 B (0.00000000 1.00000000) *
## 
## $trees[[12]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 410 B (0.44956140 0.55043860)  
##     2) smoothness_mean>=-2.424301 659 326 M (0.50531108 0.49468892)  
##       4) smoothness_mean< -2.382983 117  39 M (0.66666667 0.33333333)  
##         8) texture_worst>=4.611234 68   5 M (0.92647059 0.07352941)  
##          16) smoothness_worst>=-1.586424 64   2 M (0.96875000 0.03125000)  
##            32) symmetry_worst>=-2.212871 62   0 M (1.00000000 0.00000000) *
##            33) symmetry_worst< -2.212871 2   0 B (0.00000000 1.00000000) *
##          17) smoothness_worst< -1.586424 4   1 B (0.25000000 0.75000000)  
##            34) smoothness_mean>=-2.4008 1   0 M (1.00000000 0.00000000) *
##            35) smoothness_mean< -2.4008 3   0 B (0.00000000 1.00000000) *
##         9) texture_worst< 4.611234 49  15 B (0.30612245 0.69387755)  
##          18) smoothness_mean< -2.411844 16   4 M (0.75000000 0.25000000)  
##            36) smoothness_worst< -1.538735 10   0 M (1.00000000 0.00000000) *
##            37) smoothness_worst>=-1.538735 6   2 B (0.33333333 0.66666667)  
##              74) smoothness_mean< -2.421763 2   0 M (1.00000000 0.00000000) *
##              75) smoothness_mean>=-2.421763 4   0 B (0.00000000 1.00000000) *
##          19) smoothness_mean>=-2.411844 33   3 B (0.09090909 0.90909091)  
##            38) texture_mean>=2.964668 4   1 M (0.75000000 0.25000000)  
##              76) smoothness_mean>=-2.394659 3   0 M (1.00000000 0.00000000) *
##              77) smoothness_mean< -2.394659 1   0 B (0.00000000 1.00000000) *
##            39) texture_mean< 2.964668 29   0 B (0.00000000 1.00000000) *
##       5) smoothness_mean>=-2.382983 542 255 B (0.47047970 0.52952030)  
##        10) compactness_se>=-4.025757 373 169 M (0.54691689 0.45308311)  
##          20) smoothness_mean>=-2.333148 292 111 M (0.61986301 0.38013699)  
##            40) symmetry_worst>=-1.839419 204  56 M (0.72549020 0.27450980)  
##              80) texture_worst>=4.508732 140  21 M (0.85000000 0.15000000) *
##              81) texture_worst< 4.508732 64  29 B (0.45312500 0.54687500) *
##            41) symmetry_worst< -1.839419 88  33 B (0.37500000 0.62500000)  
##              82) symmetry_worst< -1.925345 62  30 M (0.51612903 0.48387097) *
##              83) symmetry_worst>=-1.925345 26   1 B (0.03846154 0.96153846) *
##          21) smoothness_mean< -2.333148 81  23 B (0.28395062 0.71604938)  
##            42) symmetry_worst< -1.571577 52  23 B (0.44230769 0.55769231)  
##              84) symmetry_worst>=-1.716495 18   3 M (0.83333333 0.16666667) *
##              85) symmetry_worst< -1.716495 34   8 B (0.23529412 0.76470588) *
##            43) symmetry_worst>=-1.571577 29   0 B (0.00000000 1.00000000) *
##        11) compactness_se< -4.025757 169  51 B (0.30177515 0.69822485)  
##          22) smoothness_mean< -2.294121 82  36 M (0.56097561 0.43902439)  
##            44) texture_worst>=4.376622 69  23 M (0.66666667 0.33333333)  
##              88) texture_worst< 4.626933 29   0 M (1.00000000 0.00000000) *
##              89) texture_worst>=4.626933 40  17 B (0.42500000 0.57500000) *
##            45) texture_worst< 4.376622 13   0 B (0.00000000 1.00000000) *
##          23) smoothness_mean>=-2.294121 87   5 B (0.05747126 0.94252874)  
##            46) smoothness_mean>=-2.21595 20   5 B (0.25000000 0.75000000)  
##              92) smoothness_mean< -2.210016 6   1 M (0.83333333 0.16666667) *
##              93) smoothness_mean>=-2.210016 14   0 B (0.00000000 1.00000000) *
##            47) smoothness_mean< -2.21595 67   0 B (0.00000000 1.00000000) *
##     3) smoothness_mean< -2.424301 253  77 B (0.30434783 0.69565217)  
##       6) texture_mean>=2.963209 154  65 B (0.42207792 0.57792208)  
##        12) texture_mean< 3.176386 107  48 M (0.55140187 0.44859813)  
##          24) texture_mean>=3.129791 26   0 M (1.00000000 0.00000000) *
##          25) texture_mean< 3.129791 81  33 B (0.40740741 0.59259259)  
##            50) symmetry_worst>=-1.54778 10   0 M (1.00000000 0.00000000) *
##            51) symmetry_worst< -1.54778 71  23 B (0.32394366 0.67605634)  
##             102) smoothness_mean< -2.478376 44  21 M (0.52272727 0.47727273) *
##             103) smoothness_mean>=-2.478376 27   0 B (0.00000000 1.00000000) *
##        13) texture_mean>=3.176386 47   6 B (0.12765957 0.87234043)  
##          26) symmetry_worst>=-1.530091 3   0 M (1.00000000 0.00000000) *
##          27) symmetry_worst< -1.530091 44   3 B (0.06818182 0.93181818)  
##            54) symmetry_worst< -2.188379 3   1 M (0.66666667 0.33333333)  
##             108) texture_mean>=3.330945 2   0 M (1.00000000 0.00000000) *
##             109) texture_mean< 3.330945 1   0 B (0.00000000 1.00000000) *
##            55) symmetry_worst>=-2.188379 41   1 B (0.02439024 0.97560976)  
##             110) smoothness_worst>=-1.490267 3   1 B (0.33333333 0.66666667) *
##             111) smoothness_worst< -1.490267 38   0 B (0.00000000 1.00000000) *
##       7) texture_mean< 2.963209 99  12 B (0.12121212 0.87878788)  
##        14) smoothness_worst>=-1.554151 29   8 B (0.27586207 0.72413793)  
##          28) smoothness_worst< -1.551775 8   0 M (1.00000000 0.00000000) *
##          29) smoothness_worst>=-1.551775 21   0 B (0.00000000 1.00000000) *
##        15) smoothness_worst< -1.554151 70   4 B (0.05714286 0.94285714)  
##          30) compactness_se>=-3.615179 16   4 B (0.25000000 0.75000000)  
##            60) texture_mean>=2.935975 3   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 2.935975 13   1 B (0.07692308 0.92307692)  
##             122) symmetry_worst< -1.998079 1   0 M (1.00000000 0.00000000) *
##             123) symmetry_worst>=-1.998079 12   0 B (0.00000000 1.00000000) *
##          31) compactness_se< -3.615179 54   0 B (0.00000000 1.00000000) *
## 
## $trees[[13]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 410 B (0.44956140 0.55043860)  
##     2) smoothness_mean>=-2.423454 652 318 M (0.51226994 0.48773006)  
##       4) smoothness_mean< -2.312434 297 117 M (0.60606061 0.39393939)  
##         8) symmetry_worst< -1.608146 236  69 M (0.70762712 0.29237288)  
##          16) smoothness_worst>=-1.559798 179  38 M (0.78770950 0.21229050)  
##            32) smoothness_mean>=-2.416986 167  26 M (0.84431138 0.15568862)  
##              64) symmetry_worst>=-2.208456 163  22 M (0.86503067 0.13496933) *
##              65) symmetry_worst< -2.208456 4   0 B (0.00000000 1.00000000) *
##            33) smoothness_mean< -2.416986 12   0 B (0.00000000 1.00000000) *
##          17) smoothness_worst< -1.559798 57  26 B (0.45614035 0.54385965)  
##            34) texture_worst>=4.395741 38  12 M (0.68421053 0.31578947)  
##              68) smoothness_worst>=-1.586874 29   5 M (0.82758621 0.17241379) *
##              69) smoothness_worst< -1.586874 9   2 B (0.22222222 0.77777778) *
##            35) texture_worst< 4.395741 19   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst>=-1.608146 61  13 B (0.21311475 0.78688525)  
##          18) texture_mean>=3.067477 15   5 M (0.66666667 0.33333333)  
##            36) symmetry_worst>=-1.551105 11   1 M (0.90909091 0.09090909)  
##              72) compactness_se>=-4.507761 10   0 M (1.00000000 0.00000000) *
##              73) compactness_se< -4.507761 1   0 B (0.00000000 1.00000000) *
##            37) symmetry_worst< -1.551105 4   0 B (0.00000000 1.00000000) *
##          19) texture_mean< 3.067477 46   3 B (0.06521739 0.93478261)  
##            38) symmetry_worst>=-1.431522 7   3 B (0.42857143 0.57142857)  
##              76) smoothness_mean>=-2.344658 2   0 M (1.00000000 0.00000000) *
##              77) smoothness_mean< -2.344658 5   1 B (0.20000000 0.80000000) *
##            39) symmetry_worst< -1.431522 39   0 B (0.00000000 1.00000000) *
##       5) smoothness_mean>=-2.312434 355 154 B (0.43380282 0.56619718)  
##        10) symmetry_worst>=-1.612559 127  46 M (0.63779528 0.36220472)  
##          20) smoothness_worst>=-1.500061 96  18 M (0.81250000 0.18750000)  
##            40) compactness_se>=-4.214968 92  14 M (0.84782609 0.15217391)  
##              80) texture_mean>=2.822248 70   5 M (0.92857143 0.07142857) *
##              81) texture_mean< 2.822248 22   9 M (0.59090909 0.40909091) *
##            41) compactness_se< -4.214968 4   0 B (0.00000000 1.00000000) *
##          21) smoothness_worst< -1.500061 31   3 B (0.09677419 0.90322581)  
##            42) texture_mean>=3.137421 1   0 M (1.00000000 0.00000000) *
##            43) texture_mean< 3.137421 30   2 B (0.06666667 0.93333333)  
##              86) smoothness_worst< -1.50756 7   2 B (0.28571429 0.71428571) *
##              87) smoothness_worst>=-1.50756 23   0 B (0.00000000 1.00000000) *
##        11) symmetry_worst< -1.612559 228  73 B (0.32017544 0.67982456)  
##          22) smoothness_worst< -1.567043 13   0 M (1.00000000 0.00000000) *
##          23) smoothness_worst>=-1.567043 215  60 B (0.27906977 0.72093023)  
##            46) compactness_se>=-3.4389 52  20 M (0.61538462 0.38461538)  
##              92) smoothness_mean< -2.25237 25   2 M (0.92000000 0.08000000) *
##              93) smoothness_mean>=-2.25237 27   9 B (0.33333333 0.66666667) *
##            47) compactness_se< -3.4389 163  28 B (0.17177914 0.82822086)  
##              94) compactness_se>=-4.030876 113  27 B (0.23893805 0.76106195) *
##              95) compactness_se< -4.030876 50   1 B (0.02000000 0.98000000) *
##     3) smoothness_mean< -2.423454 260  76 B (0.29230769 0.70769231)  
##       6) texture_mean>=2.921008 199  74 B (0.37185930 0.62814070)  
##        12) texture_mean< 3.176386 142  68 B (0.47887324 0.52112676)  
##          24) texture_mean>=3.130673 27   0 M (1.00000000 0.00000000) *
##          25) texture_mean< 3.130673 115  41 B (0.35652174 0.64347826)  
##            50) smoothness_worst< -1.60795 61  28 M (0.54098361 0.45901639)  
##             100) symmetry_worst>=-1.874628 41   9 M (0.78048780 0.21951220) *
##             101) symmetry_worst< -1.874628 20   1 B (0.05000000 0.95000000) *
##            51) smoothness_worst>=-1.60795 54   8 B (0.14814815 0.85185185)  
##             102) texture_mean< 2.930359 6   0 M (1.00000000 0.00000000) *
##             103) texture_mean>=2.930359 48   2 B (0.04166667 0.95833333) *
##        13) texture_mean>=3.176386 57   6 B (0.10526316 0.89473684)  
##          26) texture_mean>=3.388429 8   3 M (0.62500000 0.37500000)  
##            52) smoothness_mean>=-2.520061 5   0 M (1.00000000 0.00000000) *
##            53) smoothness_mean< -2.520061 3   0 B (0.00000000 1.00000000) *
##          27) texture_mean< 3.388429 49   1 B (0.02040816 0.97959184)  
##            54) smoothness_mean>=-2.425205 1   0 M (1.00000000 0.00000000) *
##            55) smoothness_mean< -2.425205 48   0 B (0.00000000 1.00000000) *
##       7) texture_mean< 2.921008 61   2 B (0.03278689 0.96721311)  
##        14) texture_worst< 3.92417 4   1 B (0.25000000 0.75000000)  
##          28) texture_mean>=2.707858 1   0 M (1.00000000 0.00000000) *
##          29) texture_mean< 2.707858 3   0 B (0.00000000 1.00000000) *
##        15) texture_worst>=3.92417 57   1 B (0.01754386 0.98245614)  
##          30) texture_worst< 4.400796 22   1 B (0.04545455 0.95454545)  
##            60) texture_worst>=4.389974 1   0 M (1.00000000 0.00000000) *
##            61) texture_worst< 4.389974 21   0 B (0.00000000 1.00000000) *
##          31) texture_worst>=4.400796 35   0 B (0.00000000 1.00000000) *
## 
## $trees[[14]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 436 B (0.47807018 0.52192982)  
##     2) texture_mean>=3.005187 425 172 M (0.59529412 0.40470588)  
##       4) texture_mean< 3.02965 48   3 M (0.93750000 0.06250000)  
##         8) compactness_se>=-4.262999 46   1 M (0.97826087 0.02173913)  
##          16) smoothness_mean>=-2.60159 45   0 M (1.00000000 0.00000000) *
##          17) smoothness_mean< -2.60159 1   0 B (0.00000000 1.00000000) *
##         9) compactness_se< -4.262999 2   0 B (0.00000000 1.00000000) *
##       5) texture_mean>=3.02965 377 169 M (0.55172414 0.44827586)  
##        10) smoothness_mean>=-2.258569 81  18 M (0.77777778 0.22222222)  
##          20) smoothness_mean< -2.099273 68   5 M (0.92647059 0.07352941)  
##            40) compactness_se>=-4.045035 63   0 M (1.00000000 0.00000000) *
##            41) compactness_se< -4.045035 5   0 B (0.00000000 1.00000000) *
##          21) smoothness_mean>=-2.099273 13   0 B (0.00000000 1.00000000) *
##        11) smoothness_mean< -2.258569 296 145 B (0.48986486 0.51013514)  
##          22) compactness_se>=-3.477231 92  25 M (0.72826087 0.27173913)  
##            44) smoothness_worst< -1.468038 73   7 M (0.90410959 0.09589041)  
##              88) texture_mean>=3.038537 71   5 M (0.92957746 0.07042254) *
##              89) texture_mean< 3.038537 2   0 B (0.00000000 1.00000000) *
##            45) smoothness_worst>=-1.468038 19   1 B (0.05263158 0.94736842)  
##              90) smoothness_mean>=-2.2833 1   0 M (1.00000000 0.00000000) *
##              91) smoothness_mean< -2.2833 18   0 B (0.00000000 1.00000000) *
##          23) compactness_se< -3.477231 204  78 B (0.38235294 0.61764706)  
##            46) compactness_se< -3.872601 117  54 M (0.53846154 0.46153846)  
##              92) smoothness_mean< -2.291157 96  36 M (0.62500000 0.37500000) *
##              93) smoothness_mean>=-2.291157 21   3 B (0.14285714 0.85714286) *
##            47) compactness_se>=-3.872601 87  15 B (0.17241379 0.82758621)  
##              94) smoothness_worst>=-1.442386 6   0 M (1.00000000 0.00000000) *
##              95) smoothness_worst< -1.442386 81   9 B (0.11111111 0.88888889) *
##     3) texture_mean< 3.005187 487 183 B (0.37577002 0.62422998)  
##       6) smoothness_worst>=-1.451541 87  32 M (0.63218391 0.36781609)  
##        12) compactness_se>=-4.04059 65  15 M (0.76923077 0.23076923)  
##          24) smoothness_worst< -1.349735 54   6 M (0.88888889 0.11111111)  
##            48) texture_mean>=2.780541 49   3 M (0.93877551 0.06122449)  
##              96) texture_worst< 4.783684 41   0 M (1.00000000 0.00000000) *
##              97) texture_worst>=4.783684 8   3 M (0.62500000 0.37500000) *
##            49) texture_mean< 2.780541 5   2 B (0.40000000 0.60000000)  
##              98) smoothness_mean>=-2.150667 2   0 M (1.00000000 0.00000000) *
##              99) smoothness_mean< -2.150667 3   0 B (0.00000000 1.00000000) *
##          25) smoothness_worst>=-1.349735 11   2 B (0.18181818 0.81818182)  
##            50) symmetry_worst>=-1.232339 2   0 M (1.00000000 0.00000000) *
##            51) symmetry_worst< -1.232339 9   0 B (0.00000000 1.00000000) *
##        13) compactness_se< -4.04059 22   5 B (0.22727273 0.77272727)  
##          26) symmetry_worst< -1.750302 5   0 M (1.00000000 0.00000000) *
##          27) symmetry_worst>=-1.750302 17   0 B (0.00000000 1.00000000) *
##       7) smoothness_worst< -1.451541 400 128 B (0.32000000 0.68000000)  
##        14) texture_worst>=4.83005 10   1 M (0.90000000 0.10000000)  
##          28) texture_mean>=2.915217 9   0 M (1.00000000 0.00000000) *
##          29) texture_mean< 2.915217 1   0 B (0.00000000 1.00000000) *
##        15) texture_worst< 4.83005 390 119 B (0.30512821 0.69487179)  
##          30) smoothness_worst< -1.478565 314 110 B (0.35031847 0.64968153)  
##            60) smoothness_worst>=-1.482701 24   3 M (0.87500000 0.12500000)  
##             120) smoothness_mean< -2.253991 17   0 M (1.00000000 0.00000000) *
##             121) smoothness_mean>=-2.253991 7   3 M (0.57142857 0.42857143) *
##            61) smoothness_worst< -1.482701 290  89 B (0.30689655 0.69310345)  
##             122) symmetry_worst< -1.692331 183  74 B (0.40437158 0.59562842) *
##             123) symmetry_worst>=-1.692331 107  15 B (0.14018692 0.85981308) *
##          31) smoothness_worst>=-1.478565 76   9 B (0.11842105 0.88157895)  
##            62) texture_mean>=2.99172 4   0 M (1.00000000 0.00000000) *
##            63) texture_mean< 2.99172 72   5 B (0.06944444 0.93055556)  
##             126) compactness_se>=-3.453499 14   4 B (0.28571429 0.71428571) *
##             127) compactness_se< -3.453499 58   1 B (0.01724138 0.98275862) *
## 
## $trees[[15]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 429 B (0.47039474 0.52960526)  
##     2) smoothness_mean>=-2.489159 806 400 M (0.50372208 0.49627792)  
##       4) symmetry_worst< -2.384404 25   3 M (0.88000000 0.12000000)  
##         8) texture_mean>=2.861235 22   0 M (1.00000000 0.00000000) *
##         9) texture_mean< 2.861235 3   0 B (0.00000000 1.00000000) *
##       5) symmetry_worst>=-2.384404 781 384 B (0.49167734 0.50832266)  
##        10) symmetry_worst>=-2.232873 745 363 M (0.51275168 0.48724832)  
##          20) smoothness_mean< -2.473552 21   2 M (0.90476190 0.09523810)  
##            40) texture_mean>=2.967697 19   0 M (1.00000000 0.00000000) *
##            41) texture_mean< 2.967697 2   0 B (0.00000000 1.00000000) *
##          21) smoothness_mean>=-2.473552 724 361 M (0.50138122 0.49861878)  
##            42) smoothness_mean>=-2.425205 646 305 M (0.52786378 0.47213622)  
##              84) texture_worst>=4.896309 113  35 M (0.69026549 0.30973451) *
##              85) texture_worst< 4.896309 533 263 B (0.49343340 0.50656660) *
##            43) smoothness_mean< -2.425205 78  22 B (0.28205128 0.71794872)  
##              86) smoothness_mean< -2.441446 49  22 B (0.44897959 0.55102041) *
##              87) smoothness_mean>=-2.441446 29   0 B (0.00000000 1.00000000) *
##        11) symmetry_worst< -2.232873 36   2 B (0.05555556 0.94444444)  
##          22) smoothness_mean< -2.453321 3   1 M (0.66666667 0.33333333)  
##            44) texture_mean>=3.037949 2   0 M (1.00000000 0.00000000) *
##            45) texture_mean< 3.037949 1   0 B (0.00000000 1.00000000) *
##          23) smoothness_mean>=-2.453321 33   0 B (0.00000000 1.00000000) *
##     3) smoothness_mean< -2.489159 106  23 B (0.21698113 0.78301887)  
##       6) symmetry_worst>=-1.667161 27  13 M (0.51851852 0.48148148)  
##        12) smoothness_worst< -1.616835 15   1 M (0.93333333 0.06666667)  
##          24) texture_mean< 3.135016 14   0 M (1.00000000 0.00000000) *
##          25) texture_mean>=3.135016 1   0 B (0.00000000 1.00000000) *
##        13) smoothness_worst>=-1.616835 12   0 B (0.00000000 1.00000000) *
##       7) symmetry_worst< -1.667161 79   9 B (0.11392405 0.88607595)  
##        14) compactness_se>=-3.613485 21   7 B (0.33333333 0.66666667)  
##          28) smoothness_mean>=-2.508983 3   0 M (1.00000000 0.00000000) *
##          29) smoothness_mean< -2.508983 18   4 B (0.22222222 0.77777778)  
##            58) texture_mean>=3.076827 5   1 M (0.80000000 0.20000000)  
##             116) texture_mean< 3.103494 4   0 M (1.00000000 0.00000000) *
##             117) texture_mean>=3.103494 1   0 B (0.00000000 1.00000000) *
##            59) texture_mean< 3.076827 13   0 B (0.00000000 1.00000000) *
##        15) compactness_se< -3.613485 58   2 B (0.03448276 0.96551724)  
##          30) compactness_se< -4.692873 11   2 B (0.18181818 0.81818182)  
##            60) compactness_se>=-4.711555 2   0 M (1.00000000 0.00000000) *
##            61) compactness_se< -4.711555 9   0 B (0.00000000 1.00000000) *
##          31) compactness_se>=-4.692873 47   0 B (0.00000000 1.00000000) *
## 
## $trees[[16]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 912 399 M (0.56250000 0.43750000)  
##    2) compactness_se>=-4.691273 882 370 M (0.58049887 0.41950113)  
##      4) smoothness_mean>=-2.332942 448 150 M (0.66517857 0.33482143)  
##        8) compactness_se>=-4.032549 358  96 M (0.73184358 0.26815642)  
##         16) symmetry_worst>=-2.188127 337  80 M (0.76261128 0.23738872)  
##           32) smoothness_mean< -2.106197 314  64 M (0.79617834 0.20382166)  
##             64) texture_worst>=4.895983 58   0 M (1.00000000 0.00000000) *
##             65) texture_worst< 4.895983 256  64 M (0.75000000 0.25000000) *
##           33) smoothness_mean>=-2.106197 23   7 B (0.30434783 0.69565217)  
##             66) symmetry_worst>=-1.596878 8   1 M (0.87500000 0.12500000) *
##             67) symmetry_worst< -1.596878 15   0 B (0.00000000 1.00000000) *
##         17) symmetry_worst< -2.188127 21   5 B (0.23809524 0.76190476)  
##           34) smoothness_mean>=-2.244441 5   0 M (1.00000000 0.00000000) *
##           35) smoothness_mean< -2.244441 16   0 B (0.00000000 1.00000000) *
##        9) compactness_se< -4.032549 90  36 B (0.40000000 0.60000000)  
##         18) smoothness_mean< -2.291157 40   6 M (0.85000000 0.15000000)  
##           36) texture_mean>=2.834088 37   3 M (0.91891892 0.08108108)  
##             72) compactness_se< -4.098353 36   2 M (0.94444444 0.05555556) *
##             73) compactness_se>=-4.098353 1   0 B (0.00000000 1.00000000) *
##           37) texture_mean< 2.834088 3   0 B (0.00000000 1.00000000) *
##         19) smoothness_mean>=-2.291157 50   2 B (0.04000000 0.96000000)  
##           38) smoothness_mean>=-2.21595 12   2 B (0.16666667 0.83333333)  
##             76) compactness_se< -4.208747 2   0 M (1.00000000 0.00000000) *
##             77) compactness_se>=-4.208747 10   0 B (0.00000000 1.00000000) *
##           39) smoothness_mean< -2.21595 38   0 B (0.00000000 1.00000000) *
##      5) smoothness_mean< -2.332942 434 214 B (0.49308756 0.50691244)  
##       10) texture_mean< 3.227241 376 171 M (0.54521277 0.45478723)  
##         20) compactness_se< -3.426516 304 118 M (0.61184211 0.38815789)  
##           40) texture_mean>=2.874407 247  79 M (0.68016194 0.31983806)  
##             80) compactness_se< -4.039628 134  23 M (0.82835821 0.17164179) *
##             81) compactness_se>=-4.039628 113  56 M (0.50442478 0.49557522) *
##           41) texture_mean< 2.874407 57  18 B (0.31578947 0.68421053)  
##             82) smoothness_worst>=-1.454595 8   1 M (0.87500000 0.12500000) *
##             83) smoothness_worst< -1.454595 49  11 B (0.22448980 0.77551020) *
##         21) compactness_se>=-3.426516 72  19 B (0.26388889 0.73611111)  
##           42) texture_mean>=3.06339 26  10 M (0.61538462 0.38461538)  
##             84) symmetry_worst>=-2.189138 20   4 M (0.80000000 0.20000000) *
##             85) symmetry_worst< -2.189138 6   0 B (0.00000000 1.00000000) *
##           43) texture_mean< 3.06339 46   3 B (0.06521739 0.93478261)  
##             86) compactness_se< -3.392487 6   2 B (0.33333333 0.66666667) *
##             87) compactness_se>=-3.392487 40   1 B (0.02500000 0.97500000) *
##       11) texture_mean>=3.227241 58   9 B (0.15517241 0.84482759)  
##         22) compactness_se>=-3.482708 6   2 M (0.66666667 0.33333333)  
##           44) texture_mean>=3.256167 4   0 M (1.00000000 0.00000000) *
##           45) texture_mean< 3.256167 2   0 B (0.00000000 1.00000000) *
##         23) compactness_se< -3.482708 52   5 B (0.09615385 0.90384615)  
##           46) texture_mean>=3.431382 2   0 M (1.00000000 0.00000000) *
##           47) texture_mean< 3.431382 50   3 B (0.06000000 0.94000000)  
##             94) texture_mean>=3.388429 6   2 B (0.33333333 0.66666667) *
##             95) texture_mean< 3.388429 44   1 B (0.02272727 0.97727273) *
##    3) compactness_se< -4.691273 30   1 B (0.03333333 0.96666667)  
##      6) symmetry_worst>=-1.124659 1   0 M (1.00000000 0.00000000) *
##      7) symmetry_worst< -1.124659 29   0 B (0.00000000 1.00000000) *
## 
## $trees[[17]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 448 B (0.49122807 0.50877193)  
##     2) smoothness_mean>=-2.423737 712 329 M (0.53792135 0.46207865)  
##       4) compactness_se>=-3.011681 66  11 M (0.83333333 0.16666667)  
##         8) smoothness_worst< -1.454202 48   1 M (0.97916667 0.02083333)  
##          16) smoothness_mean>=-2.336585 43   0 M (1.00000000 0.00000000) *
##          17) smoothness_mean< -2.336585 5   1 M (0.80000000 0.20000000)  
##            34) texture_mean>=2.972459 4   0 M (1.00000000 0.00000000) *
##            35) texture_mean< 2.972459 1   0 B (0.00000000 1.00000000) *
##         9) smoothness_worst>=-1.454202 18   8 B (0.44444444 0.55555556)  
##          18) smoothness_mean>=-2.161865 6   0 M (1.00000000 0.00000000) *
##          19) smoothness_mean< -2.161865 12   2 B (0.16666667 0.83333333)  
##            38) symmetry_worst< -1.642275 2   0 M (1.00000000 0.00000000) *
##            39) symmetry_worst>=-1.642275 10   0 B (0.00000000 1.00000000) *
##       5) compactness_se< -3.011681 646 318 M (0.50773994 0.49226006)  
##        10) smoothness_worst>=-1.472307 191  67 M (0.64921466 0.35078534)  
##          20) smoothness_mean< -2.300091 57   5 M (0.91228070 0.08771930)  
##            40) texture_mean>=2.735974 54   2 M (0.96296296 0.03703704)  
##              80) compactness_se>=-4.497673 53   1 M (0.98113208 0.01886792) *
##              81) compactness_se< -4.497673 1   0 B (0.00000000 1.00000000) *
##            41) texture_mean< 2.735974 3   0 B (0.00000000 1.00000000) *
##          21) smoothness_mean>=-2.300091 134  62 M (0.53731343 0.46268657)  
##            42) compactness_se>=-4.030558 98  30 M (0.69387755 0.30612245)  
##              84) texture_mean>=2.915043 60   7 M (0.88333333 0.11666667) *
##              85) texture_mean< 2.915043 38  15 B (0.39473684 0.60526316) *
##            43) compactness_se< -4.030558 36   4 B (0.11111111 0.88888889)  
##              86) symmetry_worst< -1.743442 7   3 M (0.57142857 0.42857143) *
##              87) symmetry_worst>=-1.743442 29   0 B (0.00000000 1.00000000) *
##        11) smoothness_worst< -1.472307 455 204 B (0.44835165 0.55164835)  
##          22) symmetry_worst< -1.692015 327 150 M (0.54128440 0.45871560)  
##            44) smoothness_worst< -1.474843 302 126 M (0.58278146 0.41721854)  
##              88) texture_mean< 3.36829 288 112 M (0.61111111 0.38888889) *
##              89) texture_mean>=3.36829 14   0 B (0.00000000 1.00000000) *
##            45) smoothness_worst>=-1.474843 25   1 B (0.04000000 0.96000000)  
##              90) texture_mean>=2.978826 1   0 M (1.00000000 0.00000000) *
##              91) texture_mean< 2.978826 24   0 B (0.00000000 1.00000000) *
##          23) symmetry_worst>=-1.692015 128  27 B (0.21093750 0.78906250)  
##            46) compactness_se>=-3.4704 20   8 M (0.60000000 0.40000000)  
##              92) smoothness_worst>=-1.51165 11   1 M (0.90909091 0.09090909) *
##              93) smoothness_worst< -1.51165 9   2 B (0.22222222 0.77777778) *
##            47) compactness_se< -3.4704 108  15 B (0.13888889 0.86111111)  
##              94) texture_worst>=4.818867 3   0 M (1.00000000 0.00000000) *
##              95) texture_worst< 4.818867 105  12 B (0.11428571 0.88571429) *
##     3) smoothness_mean< -2.423737 200  65 B (0.32500000 0.67500000)  
##       6) texture_mean< 3.176386 164  65 B (0.39634146 0.60365854)  
##        12) smoothness_worst>=-1.656234 126  63 M (0.50000000 0.50000000)  
##          24) texture_mean>=3.111958 16   0 M (1.00000000 0.00000000) *
##          25) texture_mean< 3.111958 110  47 B (0.42727273 0.57272727)  
##            50) smoothness_worst< -1.551775 86  39 M (0.54651163 0.45348837)  
##             100) smoothness_mean< -2.432353 76  29 M (0.61842105 0.38157895) *
##             101) smoothness_mean>=-2.432353 10   0 B (0.00000000 1.00000000) *
##            51) smoothness_worst>=-1.551775 24   0 B (0.00000000 1.00000000) *
##        13) smoothness_worst< -1.656234 38   2 B (0.05263158 0.94736842)  
##          26) smoothness_worst< -1.720903 7   2 B (0.28571429 0.71428571)  
##            52) compactness_se>=-3.013033 2   0 M (1.00000000 0.00000000) *
##            53) compactness_se< -3.013033 5   0 B (0.00000000 1.00000000) *
##          27) smoothness_worst>=-1.720903 31   0 B (0.00000000 1.00000000) *
##       7) texture_mean>=3.176386 36   0 B (0.00000000 1.00000000) *
## 
## $trees[[18]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 912 401 B (0.43969298 0.56030702)  
##    2) smoothness_worst>=-1.603315 769 369 B (0.47984395 0.52015605)  
##      4) texture_mean>=2.709047 735 367 B (0.49931973 0.50068027)  
##        8) compactness_se>=-3.719548 299 118 M (0.60535117 0.39464883)  
##         16) texture_mean< 2.746628 28   0 M (1.00000000 0.00000000) *
##         17) texture_mean>=2.746628 271 118 M (0.56457565 0.43542435)  
##           34) symmetry_worst>=-1.606092 73  17 M (0.76712329 0.23287671)  
##             68) smoothness_mean>=-2.298098 50   5 M (0.90000000 0.10000000) *
##             69) smoothness_mean< -2.298098 23  11 B (0.47826087 0.52173913) *
##           35) symmetry_worst< -1.606092 198  97 B (0.48989899 0.51010101)  
##             70) smoothness_mean< -2.14559 180  83 M (0.53888889 0.46111111) *
##             71) smoothness_mean>=-2.14559 18   0 B (0.00000000 1.00000000) *
##        9) compactness_se< -3.719548 436 186 B (0.42660550 0.57339450)  
##         18) compactness_se< -3.859436 366 178 B (0.48633880 0.51366120)  
##           36) smoothness_mean< -2.291157 278 122 M (0.56115108 0.43884892)  
##             72) smoothness_worst>=-1.55307 192  62 M (0.67708333 0.32291667) *
##             73) smoothness_worst< -1.55307 86  26 B (0.30232558 0.69767442) *
##           37) smoothness_mean>=-2.291157 88  22 B (0.25000000 0.75000000)  
##             74) compactness_se>=-4.032549 37  15 M (0.59459459 0.40540541) *
##             75) compactness_se< -4.032549 51   0 B (0.00000000 1.00000000) *
##         19) compactness_se>=-3.859436 70   8 B (0.11428571 0.88571429)  
##           38) smoothness_worst>=-1.472895 9   3 M (0.66666667 0.33333333)  
##             76) smoothness_mean< -2.17953 6   0 M (1.00000000 0.00000000) *
##             77) smoothness_mean>=-2.17953 3   0 B (0.00000000 1.00000000) *
##           39) smoothness_worst< -1.472895 61   2 B (0.03278689 0.96721311)  
##             78) symmetry_worst< -1.901985 8   2 B (0.25000000 0.75000000) *
##             79) symmetry_worst>=-1.901985 53   0 B (0.00000000 1.00000000) *
##      5) texture_mean< 2.709047 34   2 B (0.05882353 0.94117647)  
##       10) texture_worst< 3.80118 10   2 B (0.20000000 0.80000000)  
##         20) texture_mean>=2.630644 2   0 M (1.00000000 0.00000000) *
##         21) texture_mean< 2.630644 8   0 B (0.00000000 1.00000000) *
##       11) texture_worst>=3.80118 24   0 B (0.00000000 1.00000000) *
##    3) smoothness_worst< -1.603315 143  32 B (0.22377622 0.77622378)  
##      6) symmetry_worst>=-1.868413 63  28 B (0.44444444 0.55555556)  
##       12) texture_worst>=4.334485 50  22 M (0.56000000 0.44000000)  
##         24) texture_mean< 3.062357 33   7 M (0.78787879 0.21212121)  
##           48) texture_mean>=2.939162 28   2 M (0.92857143 0.07142857)  
##             96) compactness_se>=-4.938351 27   1 M (0.96296296 0.03703704) *
##             97) compactness_se< -4.938351 1   0 B (0.00000000 1.00000000) *
##           49) texture_mean< 2.939162 5   0 B (0.00000000 1.00000000) *
##         25) texture_mean>=3.062357 17   2 B (0.11764706 0.88235294)  
##           50) smoothness_mean>=-2.337942 2   0 M (1.00000000 0.00000000) *
##           51) smoothness_mean< -2.337942 15   0 B (0.00000000 1.00000000) *
##       13) texture_worst< 4.334485 13   0 B (0.00000000 1.00000000) *
##      7) symmetry_worst< -1.868413 80   4 B (0.05000000 0.95000000)  
##       14) smoothness_mean>=-2.373736 1   0 M (1.00000000 0.00000000) *
##       15) smoothness_mean< -2.373736 79   3 B (0.03797468 0.96202532)  
##         30) smoothness_worst< -1.718904 10   3 B (0.30000000 0.70000000)  
##           60) compactness_se>=-3.013033 3   0 M (1.00000000 0.00000000) *
##           61) compactness_se< -3.013033 7   0 B (0.00000000 1.00000000) *
##         31) smoothness_worst>=-1.718904 69   0 B (0.00000000 1.00000000) *
## 
## $trees[[19]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 353 B (0.38706140 0.61293860)  
##     2) symmetry_worst< -2.385442 27   3 M (0.88888889 0.11111111)  
##       4) texture_mean< 3.283931 26   2 M (0.92307692 0.07692308)  
##         8) smoothness_worst< -1.534654 21   0 M (1.00000000 0.00000000) *
##         9) smoothness_worst>=-1.534654 5   2 M (0.60000000 0.40000000)  
##          18) texture_mean>=3.050671 3   0 M (1.00000000 0.00000000) *
##          19) texture_mean< 3.050671 2   0 B (0.00000000 1.00000000) *
##       5) texture_mean>=3.283931 1   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst>=-2.385442 885 329 B (0.37175141 0.62824859)  
##       6) symmetry_worst>=-1.840831 498 217 B (0.43574297 0.56425703)  
##        12) symmetry_worst< -1.750623 159  70 M (0.55974843 0.44025157)  
##          24) smoothness_worst>=-1.547262 101  30 M (0.70297030 0.29702970)  
##            48) smoothness_mean< -2.210016 85  15 M (0.82352941 0.17647059)  
##              96) compactness_se>=-4.388189 81  11 M (0.86419753 0.13580247) *
##              97) compactness_se< -4.388189 4   0 B (0.00000000 1.00000000) *
##            49) smoothness_mean>=-2.210016 16   1 B (0.06250000 0.93750000)  
##              98) compactness_se>=-3.317826 1   0 M (1.00000000 0.00000000) *
##              99) compactness_se< -3.317826 15   0 B (0.00000000 1.00000000) *
##          25) smoothness_worst< -1.547262 58  18 B (0.31034483 0.68965517)  
##            50) texture_worst>=4.56463 12   1 M (0.91666667 0.08333333)  
##             100) texture_mean< 3.176386 11   0 M (1.00000000 0.00000000) *
##             101) texture_mean>=3.176386 1   0 B (0.00000000 1.00000000) *
##            51) texture_worst< 4.56463 46   7 B (0.15217391 0.84782609)  
##             102) smoothness_mean>=-2.302636 4   0 M (1.00000000 0.00000000) *
##             103) smoothness_mean< -2.302636 42   3 B (0.07142857 0.92857143) *
##        13) symmetry_worst>=-1.750623 339 128 B (0.37758112 0.62241888)  
##          26) smoothness_worst>=-1.473088 100  41 M (0.59000000 0.41000000)  
##            52) compactness_se< -2.961809 85  28 M (0.67058824 0.32941176)  
##             104) symmetry_worst>=-1.65458 67  14 M (0.79104478 0.20895522) *
##             105) symmetry_worst< -1.65458 18   4 B (0.22222222 0.77777778) *
##            53) compactness_se>=-2.961809 15   2 B (0.13333333 0.86666667)  
##             106) texture_mean< 2.81718 1   0 M (1.00000000 0.00000000) *
##             107) texture_mean>=2.81718 14   1 B (0.07142857 0.92857143) *
##          27) smoothness_worst< -1.473088 239  69 B (0.28870293 0.71129707)  
##            54) compactness_se>=-3.483184 52  20 M (0.61538462 0.38461538)  
##             108) texture_mean>=2.956366 37   7 M (0.81081081 0.18918919) *
##             109) texture_mean< 2.956366 15   2 B (0.13333333 0.86666667) *
##            55) compactness_se< -3.483184 187  37 B (0.19786096 0.80213904)  
##             110) compactness_se< -4.658767 5   0 M (1.00000000 0.00000000) *
##             111) compactness_se>=-4.658767 182  32 B (0.17582418 0.82417582) *
##       7) symmetry_worst< -1.840831 387 112 B (0.28940568 0.71059432)  
##        14) texture_worst>=4.907333 79  34 M (0.56962025 0.43037975)  
##          28) symmetry_worst>=-2.207988 54  12 M (0.77777778 0.22222222)  
##            56) smoothness_mean>=-2.427815 34   0 M (1.00000000 0.00000000) *
##            57) smoothness_mean< -2.427815 20   8 B (0.40000000 0.60000000)  
##             114) texture_worst< 4.987149 8   0 M (1.00000000 0.00000000) *
##             115) texture_worst>=4.987149 12   0 B (0.00000000 1.00000000) *
##          29) symmetry_worst< -2.207988 25   3 B (0.12000000 0.88000000)  
##            58) compactness_se>=-3.413706 3   0 M (1.00000000 0.00000000) *
##            59) compactness_se< -3.413706 22   0 B (0.00000000 1.00000000) *
##        15) texture_worst< 4.907333 308  67 B (0.21753247 0.78246753)  
##          30) compactness_se>=-3.02233 20   7 M (0.65000000 0.35000000)  
##            60) compactness_se< -2.870592 12   0 M (1.00000000 0.00000000) *
##            61) compactness_se>=-2.870592 8   1 B (0.12500000 0.87500000)  
##             122) texture_mean>=3.109826 1   0 M (1.00000000 0.00000000) *
##             123) texture_mean< 3.109826 7   0 B (0.00000000 1.00000000) *
##          31) compactness_se< -3.02233 288  54 B (0.18750000 0.81250000)  
##            62) smoothness_mean< -2.35264 140  44 B (0.31428571 0.68571429)  
##             124) smoothness_mean>=-2.394871 44  13 M (0.70454545 0.29545455) *
##             125) smoothness_mean< -2.394871 96  13 B (0.13541667 0.86458333) *
##            63) smoothness_mean>=-2.35264 148  10 B (0.06756757 0.93243243)  
##             126) smoothness_worst>=-1.49704 73  10 B (0.13698630 0.86301370) *
##             127) smoothness_worst< -1.49704 75   0 B (0.00000000 1.00000000) *
## 
## $trees[[20]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 403 B (0.44188596 0.55811404)  
##     2) symmetry_worst>=-1.353976 49  10 M (0.79591837 0.20408163)  
##       4) texture_mean>=2.644674 46   7 M (0.84782609 0.15217391)  
##         8) compactness_se< -2.588521 41   3 M (0.92682927 0.07317073)  
##          16) texture_mean< 3.104075 34   0 M (1.00000000 0.00000000) *
##          17) texture_mean>=3.104075 7   3 M (0.57142857 0.42857143)  
##            34) texture_mean>=3.116842 4   0 M (1.00000000 0.00000000) *
##            35) texture_mean< 3.116842 3   0 B (0.00000000 1.00000000) *
##         9) compactness_se>=-2.588521 5   1 B (0.20000000 0.80000000)  
##          18) texture_mean>=2.915767 1   0 M (1.00000000 0.00000000) *
##          19) texture_mean< 2.915767 4   0 B (0.00000000 1.00000000) *
##       5) texture_mean< 2.644674 3   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.353976 863 364 B (0.42178447 0.57821553)  
##       6) compactness_se< -3.355844 757 343 B (0.45310436 0.54689564)  
##        12) compactness_se>=-3.721197 239  94 M (0.60669456 0.39330544)  
##          24) texture_worst< 4.616724 154  41 M (0.73376623 0.26623377)  
##            48) smoothness_mean>=-2.322902 77   6 M (0.92207792 0.07792208)  
##              96) texture_worst>=4.050785 65   1 M (0.98461538 0.01538462) *
##              97) texture_worst< 4.050785 12   5 M (0.58333333 0.41666667) *
##            49) smoothness_mean< -2.322902 77  35 M (0.54545455 0.45454545)  
##              98) texture_worst>=4.56463 20   0 M (1.00000000 0.00000000) *
##              99) texture_worst< 4.56463 57  22 B (0.38596491 0.61403509) *
##          25) texture_worst>=4.616724 85  32 B (0.37647059 0.62352941)  
##            50) smoothness_mean>=-2.286719 19   4 M (0.78947368 0.21052632)  
##             100) smoothness_mean< -2.119611 15   0 M (1.00000000 0.00000000) *
##             101) smoothness_mean>=-2.119611 4   0 B (0.00000000 1.00000000) *
##            51) smoothness_mean< -2.286719 66  17 B (0.25757576 0.74242424)  
##             102) smoothness_mean< -2.349943 30  15 M (0.50000000 0.50000000) *
##             103) smoothness_mean>=-2.349943 36   2 B (0.05555556 0.94444444) *
##        13) compactness_se< -3.721197 518 198 B (0.38223938 0.61776062)  
##          26) compactness_se>=-4.705565 488 198 B (0.40573770 0.59426230)  
##            52) compactness_se< -4.116284 249 118 M (0.52610442 0.47389558)  
##             104) texture_mean< 3.032503 168  61 M (0.63690476 0.36309524) *
##             105) texture_mean>=3.032503 81  24 B (0.29629630 0.70370370) *
##            53) compactness_se>=-4.116284 239  67 B (0.28033473 0.71966527)  
##             106) texture_worst>=4.895983 46  17 M (0.63043478 0.36956522) *
##             107) texture_worst< 4.895983 193  38 B (0.19689119 0.80310881) *
##          27) compactness_se< -4.705565 30   0 B (0.00000000 1.00000000) *
##       7) compactness_se>=-3.355844 106  21 B (0.19811321 0.80188679)  
##        14) texture_mean>=3.038537 33  16 B (0.48484848 0.51515152)  
##          28) texture_mean< 3.216873 20   6 M (0.70000000 0.30000000)  
##            56) smoothness_mean< -2.154474 17   3 M (0.82352941 0.17647059)  
##             112) symmetry_worst>=-2.143533 13   0 M (1.00000000 0.00000000) *
##             113) symmetry_worst< -2.143533 4   1 B (0.25000000 0.75000000) *
##            57) smoothness_mean>=-2.154474 3   0 B (0.00000000 1.00000000) *
##          29) texture_mean>=3.216873 13   2 B (0.15384615 0.84615385)  
##            58) texture_mean>=3.252756 2   0 M (1.00000000 0.00000000) *
##            59) texture_mean< 3.252756 11   0 B (0.00000000 1.00000000) *
##        15) texture_mean< 3.038537 73   5 B (0.06849315 0.93150685)  
##          30) smoothness_mean>=-2.082188 3   0 M (1.00000000 0.00000000) *
##          31) smoothness_mean< -2.082188 70   2 B (0.02857143 0.97142857)  
##            62) smoothness_worst>=-1.481325 15   2 B (0.13333333 0.86666667)  
##             124) texture_mean>=3.031099 1   0 M (1.00000000 0.00000000) *
##             125) texture_mean< 3.031099 14   1 B (0.07142857 0.92857143) *
##            63) smoothness_worst< -1.481325 55   0 B (0.00000000 1.00000000) *
## 
## $trees[[21]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 387 B (0.42434211 0.57565789)  
##     2) symmetry_worst>=-1.366937 66  10 M (0.84848485 0.15151515)  
##       4) smoothness_mean< -2.036051 64   8 M (0.87500000 0.12500000)  
##         8) compactness_se< -2.588521 61   6 M (0.90163934 0.09836066)  
##          16) texture_mean< 3.104075 50   2 M (0.96000000 0.04000000)  
##            32) smoothness_mean>=-2.352085 34   0 M (1.00000000 0.00000000) *
##            33) smoothness_mean< -2.352085 16   2 M (0.87500000 0.12500000)  
##              66) texture_mean>=2.89093 14   0 M (1.00000000 0.00000000) *
##              67) texture_mean< 2.89093 2   0 B (0.00000000 1.00000000) *
##          17) texture_mean>=3.104075 11   4 M (0.63636364 0.36363636)  
##            34) texture_mean>=3.116842 7   0 M (1.00000000 0.00000000) *
##            35) texture_mean< 3.116842 4   0 B (0.00000000 1.00000000) *
##         9) compactness_se>=-2.588521 3   1 B (0.33333333 0.66666667)  
##          18) texture_mean>=2.996569 1   0 M (1.00000000 0.00000000) *
##          19) texture_mean< 2.996569 2   0 B (0.00000000 1.00000000) *
##       5) smoothness_mean>=-2.036051 2   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.366937 846 331 B (0.39125296 0.60874704)  
##       6) texture_mean>=3.388429 17   2 M (0.88235294 0.11764706)  
##        12) compactness_se>=-4.317414 15   0 M (1.00000000 0.00000000) *
##        13) compactness_se< -4.317414 2   0 B (0.00000000 1.00000000) *
##       7) texture_mean< 3.388429 829 316 B (0.38118215 0.61881785)  
##        14) smoothness_mean>=-2.423454 627 265 B (0.42264753 0.57735247)  
##          28) symmetry_worst>=-1.786753 307 153 M (0.50162866 0.49837134)  
##            56) symmetry_worst< -1.781339 23   0 M (1.00000000 0.00000000) *
##            57) symmetry_worst>=-1.781339 284 131 B (0.46126761 0.53873239)  
##             114) compactness_se>=-3.703794 110  42 M (0.61818182 0.38181818) *
##             115) compactness_se< -3.703794 174  63 B (0.36206897 0.63793103) *
##          29) symmetry_worst< -1.786753 320 111 B (0.34687500 0.65312500)  
##            58) symmetry_worst< -1.938643 177  78 B (0.44067797 0.55932203)  
##             116) symmetry_worst>=-1.964873 37   6 M (0.83783784 0.16216216) *
##             117) symmetry_worst< -1.964873 140  47 B (0.33571429 0.66428571) *
##            59) symmetry_worst>=-1.938643 143  33 B (0.23076923 0.76923077)  
##             118) texture_mean< 2.724206 14   4 M (0.71428571 0.28571429) *
##             119) texture_mean>=2.724206 129  23 B (0.17829457 0.82170543) *
##        15) smoothness_mean< -2.423454 202  51 B (0.25247525 0.74752475)  
##          30) symmetry_worst>=-1.541072 25   8 M (0.68000000 0.32000000)  
##            60) smoothness_mean< -2.431217 21   4 M (0.80952381 0.19047619)  
##             120) texture_mean>=2.973641 10   0 M (1.00000000 0.00000000) *
##             121) texture_mean< 2.973641 11   4 M (0.63636364 0.36363636) *
##            61) smoothness_mean>=-2.431217 4   0 B (0.00000000 1.00000000) *
##          31) symmetry_worst< -1.541072 177  34 B (0.19209040 0.80790960)  
##            62) smoothness_mean< -2.467991 111  30 B (0.27027027 0.72972973)  
##             124) smoothness_mean>=-2.468227 7   0 M (1.00000000 0.00000000) *
##             125) smoothness_mean< -2.468227 104  23 B (0.22115385 0.77884615) *
##            63) smoothness_mean>=-2.467991 66   4 B (0.06060606 0.93939394)  
##             126) symmetry_worst< -2.004084 19   4 B (0.21052632 0.78947368) *
##             127) symmetry_worst>=-2.004084 47   0 B (0.00000000 1.00000000) *
## 
## $trees[[22]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 912 434 B (0.47587719 0.52412281)  
##    2) texture_mean>=2.708379 867 426 B (0.49134948 0.50865052)  
##      4) smoothness_worst>=-1.501069 363 155 M (0.57300275 0.42699725)  
##        8) smoothness_worst< -1.476605 142  32 M (0.77464789 0.22535211)  
##         16) smoothness_worst>=-1.482699 75   6 M (0.92000000 0.08000000)  
##           32) texture_worst>=4.126187 73   4 M (0.94520548 0.05479452)  
##             64) texture_worst< 4.635614 63   0 M (1.00000000 0.00000000) *
##             65) texture_worst>=4.635614 10   4 M (0.60000000 0.40000000) *
##           33) texture_worst< 4.126187 2   0 B (0.00000000 1.00000000) *
##         17) smoothness_worst< -1.482699 67  26 M (0.61194030 0.38805970)  
##           34) smoothness_worst< -1.484675 57  16 M (0.71929825 0.28070175)  
##             68) texture_worst>=4.484566 48   8 M (0.83333333 0.16666667) *
##             69) texture_worst< 4.484566 9   1 B (0.11111111 0.88888889) *
##           35) smoothness_worst>=-1.484675 10   0 B (0.00000000 1.00000000) *
##        9) smoothness_worst>=-1.476605 221  98 B (0.44343891 0.55656109)  
##         18) smoothness_worst>=-1.473476 192  95 B (0.49479167 0.50520833)  
##           36) compactness_se>=-4.032549 128  51 M (0.60156250 0.39843750)  
##             72) compactness_se< -3.532908 60   8 M (0.86666667 0.13333333) *
##             73) compactness_se>=-3.532908 68  25 B (0.36764706 0.63235294) *
##           37) compactness_se< -4.032549 64  18 B (0.28125000 0.71875000)  
##             74) texture_mean>=3.07984 14   5 M (0.64285714 0.35714286) *
##             75) texture_mean< 3.07984 50   9 B (0.18000000 0.82000000) *
##         19) smoothness_worst< -1.473476 29   3 B (0.10344828 0.89655172)  
##           38) texture_mean>=3.069079 3   0 M (1.00000000 0.00000000) *
##           39) texture_mean< 3.069079 26   0 B (0.00000000 1.00000000) *
##      5) smoothness_worst< -1.501069 504 218 B (0.43253968 0.56746032)  
##       10) smoothness_worst< -1.519464 451 210 B (0.46563193 0.53436807)  
##         20) smoothness_worst>=-1.520292 20   0 M (1.00000000 0.00000000) *
##         21) smoothness_worst< -1.520292 431 190 B (0.44083527 0.55916473)  
##           42) smoothness_worst< -1.533868 375 180 B (0.48000000 0.52000000)  
##             84) texture_worst>=5.093455 43  10 M (0.76744186 0.23255814) *
##             85) texture_worst< 5.093455 332 147 B (0.44277108 0.55722892) *
##           43) smoothness_worst>=-1.533868 56  10 B (0.17857143 0.82142857)  
##             86) texture_mean>=3.065024 20   9 B (0.45000000 0.55000000) *
##             87) texture_mean< 3.065024 36   1 B (0.02777778 0.97222222) *
##       11) smoothness_worst>=-1.519464 53   8 B (0.15094340 0.84905660)  
##         22) texture_mean>=3.006423 25   8 B (0.32000000 0.68000000)  
##           44) symmetry_worst< -1.551134 9   2 M (0.77777778 0.22222222)  
##             88) texture_worst>=4.577291 7   0 M (1.00000000 0.00000000) *
##             89) texture_worst< 4.577291 2   0 B (0.00000000 1.00000000) *
##           45) symmetry_worst>=-1.551134 16   1 B (0.06250000 0.93750000)  
##             90) smoothness_mean>=-2.263106 1   0 M (1.00000000 0.00000000) *
##             91) smoothness_mean< -2.263106 15   0 B (0.00000000 1.00000000) *
##         23) texture_mean< 3.006423 28   0 B (0.00000000 1.00000000) *
##    3) texture_mean< 2.708379 45   8 B (0.17777778 0.82222222)  
##      6) texture_mean< 2.479051 6   1 M (0.83333333 0.16666667)  
##       12) smoothness_mean>=-2.170026 5   0 M (1.00000000 0.00000000) *
##       13) smoothness_mean< -2.170026 1   0 B (0.00000000 1.00000000) *
##      7) texture_mean>=2.479051 39   3 B (0.07692308 0.92307692)  
##       14) symmetry_worst>=-1.112025 2   0 M (1.00000000 0.00000000) *
##       15) symmetry_worst< -1.112025 37   1 B (0.02702703 0.97297297)  
##         30) texture_mean>=2.648549 10   1 B (0.10000000 0.90000000)  
##           60) texture_mean< 2.666527 1   0 M (1.00000000 0.00000000) *
##           61) texture_mean>=2.666527 9   0 B (0.00000000 1.00000000) *
##         31) texture_mean< 2.648549 27   0 B (0.00000000 1.00000000) *
## 
## $trees[[23]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 426 M (0.53289474 0.46710526)  
##     2) symmetry_worst>=-1.293329 35   3 M (0.91428571 0.08571429)  
##       4) smoothness_worst>=-1.49848 29   1 M (0.96551724 0.03448276)  
##         8) smoothness_mean>=-2.340816 28   0 M (1.00000000 0.00000000) *
##         9) smoothness_mean< -2.340816 1   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst< -1.49848 6   2 M (0.66666667 0.33333333)  
##        10) smoothness_mean< -2.349786 4   0 M (1.00000000 0.00000000) *
##        11) smoothness_mean>=-2.349786 2   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.293329 877 423 M (0.51767389 0.48232611)  
##       6) texture_mean>=2.653549 846 396 M (0.53191489 0.46808511)  
##        12) texture_mean< 3.227241 785 354 M (0.54904459 0.45095541)  
##          24) texture_worst>=5.032208 40   5 M (0.87500000 0.12500000)  
##            48) texture_worst< 5.280287 35   0 M (1.00000000 0.00000000) *
##            49) texture_worst>=5.280287 5   0 B (0.00000000 1.00000000) *
##          25) texture_worst< 5.032208 745 349 M (0.53154362 0.46845638)  
##            50) smoothness_worst< -1.403628 702 316 M (0.54985755 0.45014245)  
##             100) compactness_se< -3.392487 619 262 M (0.57673667 0.42326333) *
##             101) compactness_se>=-3.392487 83  29 B (0.34939759 0.65060241) *
##            51) smoothness_worst>=-1.403628 43  10 B (0.23255814 0.76744186)  
##             102) compactness_se>=-3.217781 12   3 M (0.75000000 0.25000000) *
##             103) compactness_se< -3.217781 31   1 B (0.03225806 0.96774194) *
##        13) texture_mean>=3.227241 61  19 B (0.31147541 0.68852459)  
##          26) smoothness_worst< -1.582589 12   2 M (0.83333333 0.16666667)  
##            52) texture_mean>=3.331484 10   0 M (1.00000000 0.00000000) *
##            53) texture_mean< 3.331484 2   0 B (0.00000000 1.00000000) *
##          27) smoothness_worst>=-1.582589 49   9 B (0.18367347 0.81632653)  
##            54) compactness_se>=-3.575987 7   2 M (0.71428571 0.28571429)  
##             108) compactness_se< -2.831802 5   0 M (1.00000000 0.00000000) *
##             109) compactness_se>=-2.831802 2   0 B (0.00000000 1.00000000) *
##            55) compactness_se< -3.575987 42   4 B (0.09523810 0.90476190)  
##             110) smoothness_mean>=-2.306298 2   0 M (1.00000000 0.00000000) *
##             111) smoothness_mean< -2.306298 40   2 B (0.05000000 0.95000000) *
##       7) texture_mean< 2.653549 31   4 B (0.12903226 0.87096774)  
##        14) smoothness_mean>=-2.07745 7   3 M (0.57142857 0.42857143)  
##          28) texture_mean< 2.515298 4   0 M (1.00000000 0.00000000) *
##          29) texture_mean>=2.515298 3   0 B (0.00000000 1.00000000) *
##        15) smoothness_mean< -2.07745 24   0 B (0.00000000 1.00000000) *
## 
## $trees[[24]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 428 B (0.46929825 0.53070175)  
##     2) symmetry_worst>=-1.348749 42   6 M (0.85714286 0.14285714)  
##       4) texture_mean>=2.756192 36   2 M (0.94444444 0.05555556)  
##         8) texture_worst>=4.373597 26   0 M (1.00000000 0.00000000) *
##         9) texture_worst< 4.373597 10   2 M (0.80000000 0.20000000)  
##          18) texture_worst< 4.279513 8   0 M (1.00000000 0.00000000) *
##          19) texture_worst>=4.279513 2   0 B (0.00000000 1.00000000) *
##       5) texture_mean< 2.756192 6   2 B (0.33333333 0.66666667)  
##        10) compactness_se>=-3.3026 2   0 M (1.00000000 0.00000000) *
##        11) compactness_se< -3.3026 4   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.348749 870 392 B (0.45057471 0.54942529)  
##       6) compactness_se>=-4.705732 850 392 B (0.46117647 0.53882353)  
##        12) texture_worst< 4.54138 369 172 M (0.53387534 0.46612466)  
##          24) texture_worst>=4.523593 41   6 M (0.85365854 0.14634146)  
##            48) smoothness_mean< -2.234468 31   0 M (1.00000000 0.00000000) *
##            49) smoothness_mean>=-2.234468 10   4 B (0.40000000 0.60000000)  
##              98) texture_mean>=3.023554 4   0 M (1.00000000 0.00000000) *
##              99) texture_mean< 3.023554 6   0 B (0.00000000 1.00000000) *
##          25) texture_worst< 4.523593 328 162 B (0.49390244 0.50609756)  
##            50) smoothness_worst< -1.541278 152  57 M (0.62500000 0.37500000)  
##             100) compactness_se>=-4.501722 126  36 M (0.71428571 0.28571429) *
##             101) compactness_se< -4.501722 26   5 B (0.19230769 0.80769231) *
##            51) smoothness_worst>=-1.541278 176  67 B (0.38068182 0.61931818)  
##             102) smoothness_worst>=-1.483493 102  47 M (0.53921569 0.46078431) *
##             103) smoothness_worst< -1.483493 74  12 B (0.16216216 0.83783784) *
##        13) texture_worst>=4.54138 481 195 B (0.40540541 0.59459459)  
##          26) compactness_se>=-3.334337 56  16 M (0.71428571 0.28571429)  
##            52) smoothness_mean>=-2.41714 37   4 M (0.89189189 0.10810811)  
##             104) texture_mean>=3.03709 31   0 M (1.00000000 0.00000000) *
##             105) texture_mean< 3.03709 6   2 B (0.33333333 0.66666667) *
##            53) smoothness_mean< -2.41714 19   7 B (0.36842105 0.63157895)  
##             106) compactness_se< -3.106177 7   0 M (1.00000000 0.00000000) *
##             107) compactness_se>=-3.106177 12   0 B (0.00000000 1.00000000) *
##          27) compactness_se< -3.334337 425 155 B (0.36470588 0.63529412)  
##            54) smoothness_mean< -2.351049 235 111 B (0.47234043 0.52765957)  
##             108) smoothness_mean>=-2.367605 23   0 M (1.00000000 0.00000000) *
##             109) smoothness_mean< -2.367605 212  88 B (0.41509434 0.58490566) *
##            55) smoothness_mean>=-2.351049 190  44 B (0.23157895 0.76842105)  
##             110) texture_worst>=5.277564 9   0 M (1.00000000 0.00000000) *
##             111) texture_worst< 5.277564 181  35 B (0.19337017 0.80662983) *
##       7) compactness_se< -4.705732 20   0 B (0.00000000 1.00000000) *
## 
## $trees[[25]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 442 M (0.51535088 0.48464912)  
##     2) texture_mean>=2.892591 632 263 M (0.58386076 0.41613924)  
##       4) compactness_se< -4.094455 216  63 M (0.70833333 0.29166667)  
##         8) smoothness_mean>=-2.426508 128  23 M (0.82031250 0.17968750)  
##          16) smoothness_mean< -2.295141 113  12 M (0.89380531 0.10619469)  
##            32) texture_mean< 3.227241 106   7 M (0.93396226 0.06603774)  
##              64) texture_worst>=4.35485 104   5 M (0.95192308 0.04807692) *
##              65) texture_worst< 4.35485 2   0 B (0.00000000 1.00000000) *
##            33) texture_mean>=3.227241 7   2 B (0.28571429 0.71428571)  
##              66) compactness_se>=-4.299856 2   0 M (1.00000000 0.00000000) *
##              67) compactness_se< -4.299856 5   0 B (0.00000000 1.00000000) *
##          17) smoothness_mean>=-2.295141 15   4 B (0.26666667 0.73333333)  
##            34) smoothness_mean>=-2.222419 4   0 M (1.00000000 0.00000000) *
##            35) smoothness_mean< -2.222419 11   0 B (0.00000000 1.00000000) *
##         9) smoothness_mean< -2.426508 88  40 M (0.54545455 0.45454545)  
##          18) symmetry_worst>=-1.705164 59  17 M (0.71186441 0.28813559)  
##            36) smoothness_worst< -1.567962 37   2 M (0.94594595 0.05405405)  
##              72) texture_mean>=2.958609 35   0 M (1.00000000 0.00000000) *
##              73) texture_mean< 2.958609 2   0 B (0.00000000 1.00000000) *
##            37) smoothness_worst>=-1.567962 22   7 B (0.31818182 0.68181818)  
##              74) texture_mean< 2.936149 7   0 M (1.00000000 0.00000000) *
##              75) texture_mean>=2.936149 15   0 B (0.00000000 1.00000000) *
##          19) symmetry_worst< -1.705164 29   6 B (0.20689655 0.79310345)  
##            38) texture_worst>=4.89091 12   6 M (0.50000000 0.50000000)  
##              76) texture_worst< 4.984007 6   0 M (1.00000000 0.00000000) *
##              77) texture_worst>=4.984007 6   0 B (0.00000000 1.00000000) *
##            39) texture_worst< 4.89091 17   0 B (0.00000000 1.00000000) *
##       5) compactness_se>=-4.094455 416 200 M (0.51923077 0.48076923)  
##        10) smoothness_mean>=-2.28279 148  39 M (0.73648649 0.26351351)  
##          20) compactness_se>=-4.030876 139  30 M (0.78417266 0.21582734)  
##            40) texture_mean>=2.912343 131  24 M (0.81679389 0.18320611)  
##              80) smoothness_worst>=-1.500666 86   7 M (0.91860465 0.08139535) *
##              81) smoothness_worst< -1.500666 45  17 M (0.62222222 0.37777778) *
##            41) texture_mean< 2.912343 8   2 B (0.25000000 0.75000000)  
##              82) smoothness_mean< -2.17953 2   0 M (1.00000000 0.00000000) *
##              83) smoothness_mean>=-2.17953 6   0 B (0.00000000 1.00000000) *
##          21) compactness_se< -4.030876 9   0 B (0.00000000 1.00000000) *
##        11) smoothness_mean< -2.28279 268 107 B (0.39925373 0.60074627)  
##          22) smoothness_worst< -1.509803 176  88 M (0.50000000 0.50000000)  
##            44) texture_worst>=4.415916 161  73 M (0.54658385 0.45341615)  
##              88) symmetry_worst>=-1.801537 69  18 M (0.73913043 0.26086957) *
##              89) symmetry_worst< -1.801537 92  37 B (0.40217391 0.59782609) *
##            45) texture_worst< 4.415916 15   0 B (0.00000000 1.00000000) *
##          23) smoothness_worst>=-1.509803 92  19 B (0.20652174 0.79347826)  
##            46) texture_worst>=4.890484 26  12 M (0.53846154 0.46153846)  
##              92) smoothness_worst>=-1.48132 13   1 M (0.92307692 0.07692308) *
##              93) smoothness_worst< -1.48132 13   2 B (0.15384615 0.84615385) *
##            47) texture_worst< 4.890484 66   5 B (0.07575758 0.92424242)  
##              94) smoothness_mean< -2.37669 7   3 M (0.57142857 0.42857143) *
##              95) smoothness_mean>=-2.37669 59   1 B (0.01694915 0.98305085) *
##     3) texture_mean< 2.892591 280 101 B (0.36071429 0.63928571)  
##       6) compactness_se>=-4.198706 215  92 B (0.42790698 0.57209302)  
##        12) compactness_se< -3.427747 173  86 M (0.50289017 0.49710983)  
##          24) compactness_se>=-3.894783 93  34 M (0.63440860 0.36559140)  
##            48) texture_worst>=4.250385 32   4 M (0.87500000 0.12500000)  
##              96) smoothness_mean>=-2.358315 28   0 M (1.00000000 0.00000000) *
##              97) smoothness_mean< -2.358315 4   0 B (0.00000000 1.00000000) *
##            49) texture_worst< 4.250385 61  30 M (0.50819672 0.49180328)  
##              98) texture_worst< 3.895613 26   4 M (0.84615385 0.15384615) *
##              99) texture_worst>=3.895613 35   9 B (0.25714286 0.74285714) *
##          25) compactness_se< -3.894783 80  28 B (0.35000000 0.65000000)  
##            50) compactness_se< -4.160164 22   3 M (0.86363636 0.13636364)  
##             100) texture_mean>=2.779034 19   0 M (1.00000000 0.00000000) *
##             101) texture_mean< 2.779034 3   0 B (0.00000000 1.00000000) *
##            51) compactness_se>=-4.160164 58   9 B (0.15517241 0.84482759)  
##             102) smoothness_worst>=-1.451541 11   2 M (0.81818182 0.18181818) *
##             103) smoothness_worst< -1.451541 47   0 B (0.00000000 1.00000000) *
##        13) compactness_se>=-3.427747 42   5 B (0.11904762 0.88095238)  
##          26) smoothness_mean>=-2.069166 4   1 M (0.75000000 0.25000000)  
##            52) texture_mean>=2.65428 3   0 M (1.00000000 0.00000000) *
##            53) texture_mean< 2.65428 1   0 B (0.00000000 1.00000000) *
##          27) smoothness_mean< -2.069166 38   2 B (0.05263158 0.94736842)  
##            54) symmetry_worst>=-1.316602 3   1 M (0.66666667 0.33333333)  
##             108) texture_mean< 2.830318 2   0 M (1.00000000 0.00000000) *
##             109) texture_mean>=2.830318 1   0 B (0.00000000 1.00000000) *
##            55) symmetry_worst< -1.316602 35   0 B (0.00000000 1.00000000) *
##       7) compactness_se< -4.198706 65   9 B (0.13846154 0.86153846)  
##        14) texture_mean>=2.87384 10   4 M (0.60000000 0.40000000)  
##          28) texture_mean< 2.884497 6   0 M (1.00000000 0.00000000) *
##          29) texture_mean>=2.884497 4   0 B (0.00000000 1.00000000) *
##        15) texture_mean< 2.87384 55   3 B (0.05454545 0.94545455)  
##          30) texture_worst>=4.600092 13   3 B (0.23076923 0.76923077)  
##            60) smoothness_mean< -2.330887 3   0 M (1.00000000 0.00000000) *
##            61) smoothness_mean>=-2.330887 10   0 B (0.00000000 1.00000000) *
##          31) texture_worst< 4.600092 42   0 B (0.00000000 1.00000000) *
## 
## $trees[[26]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 436 B (0.47807018 0.52192982)  
##     2) texture_mean>=2.960364 515 227 M (0.55922330 0.44077670)  
##       4) texture_worst>=4.354728 499 211 M (0.57715431 0.42284569)  
##         8) texture_worst< 4.644679 159  43 M (0.72955975 0.27044025)  
##          16) smoothness_worst>=-1.498254 48   2 M (0.95833333 0.04166667)  
##            32) compactness_se>=-4.35833 47   1 M (0.97872340 0.02127660)  
##              64) symmetry_worst>=-1.833099 42   0 M (1.00000000 0.00000000) *
##              65) symmetry_worst< -1.833099 5   1 M (0.80000000 0.20000000) *
##            33) compactness_se< -4.35833 1   0 B (0.00000000 1.00000000) *
##          17) smoothness_worst< -1.498254 111  41 M (0.63063063 0.36936937)  
##            34) smoothness_mean< -2.229408 100  30 M (0.70000000 0.30000000)  
##              68) compactness_se< -2.82386 91  21 M (0.76923077 0.23076923) *
##              69) compactness_se>=-2.82386 9   0 B (0.00000000 1.00000000) *
##            35) smoothness_mean>=-2.229408 11   0 B (0.00000000 1.00000000) *
##         9) texture_worst>=4.644679 340 168 M (0.50588235 0.49411765)  
##          18) texture_worst>=5.016194 100  27 M (0.73000000 0.27000000)  
##            36) symmetry_worst>=-2.063111 77   9 M (0.88311688 0.11688312)  
##              72) smoothness_mean>=-2.58821 76   8 M (0.89473684 0.10526316) *
##              73) smoothness_mean< -2.58821 1   0 B (0.00000000 1.00000000) *
##            37) symmetry_worst< -2.063111 23   5 B (0.21739130 0.78260870)  
##              74) compactness_se>=-3.413706 5   0 M (1.00000000 0.00000000) *
##              75) compactness_se< -3.413706 18   0 B (0.00000000 1.00000000) *
##          19) texture_worst< 5.016194 240  99 B (0.41250000 0.58750000)  
##            38) compactness_se< -4.436859 32   7 M (0.78125000 0.21875000)  
##              76) compactness_se>=-4.590265 25   1 M (0.96000000 0.04000000) *
##              77) compactness_se< -4.590265 7   1 B (0.14285714 0.85714286) *
##            39) compactness_se>=-4.436859 208  74 B (0.35576923 0.64423077)  
##              78) symmetry_worst< -2.000522 50  18 M (0.64000000 0.36000000) *
##              79) symmetry_worst>=-2.000522 158  42 B (0.26582278 0.73417722) *
##       5) texture_worst< 4.354728 16   0 B (0.00000000 1.00000000) *
##     3) texture_mean< 2.960364 397 148 B (0.37279597 0.62720403)  
##       6) texture_mean< 2.948902 363 148 B (0.40771350 0.59228650)  
##        12) texture_mean>=2.927988 56  17 M (0.69642857 0.30357143)  
##          24) compactness_se>=-4.177518 51  12 M (0.76470588 0.23529412)  
##            48) texture_mean< 2.938103 23   0 M (1.00000000 0.00000000) *
##            49) texture_mean>=2.938103 28  12 M (0.57142857 0.42857143)  
##              98) texture_mean>=2.947329 12   0 M (1.00000000 0.00000000) *
##              99) texture_mean< 2.947329 16   4 B (0.25000000 0.75000000) *
##          25) compactness_se< -4.177518 5   0 B (0.00000000 1.00000000) *
##        13) texture_mean< 2.927988 307 109 B (0.35504886 0.64495114)  
##          26) smoothness_worst< -1.539792 95  47 M (0.50526316 0.49473684)  
##            52) smoothness_worst>=-1.547262 18   0 M (1.00000000 0.00000000) *
##            53) smoothness_worst< -1.547262 77  30 B (0.38961039 0.61038961)  
##             106) smoothness_mean< -2.436819 41  18 M (0.56097561 0.43902439) *
##             107) smoothness_mean>=-2.436819 36   7 B (0.19444444 0.80555556) *
##          27) smoothness_worst>=-1.539792 212  61 B (0.28773585 0.71226415)  
##            54) compactness_se>=-4.198706 166  61 B (0.36746988 0.63253012)  
##             108) texture_mean< 2.893423 137  60 B (0.43795620 0.56204380) *
##             109) texture_mean>=2.893423 29   1 B (0.03448276 0.96551724) *
##            55) compactness_se< -4.198706 46   0 B (0.00000000 1.00000000) *
##       7) texture_mean>=2.948902 34   0 B (0.00000000 1.00000000) *
## 
## $trees[[27]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 912 422 M (0.53728070 0.46271930)  
##    2) texture_mean>=2.709047 873 389 M (0.55441008 0.44558992)  
##      4) compactness_se>=-3.721197 350 125 M (0.64285714 0.35714286)  
##        8) compactness_se< -3.57681 68   6 M (0.91176471 0.08823529)  
##         16) smoothness_mean>=-2.509667 66   4 M (0.93939394 0.06060606)  
##           32) texture_worst< 4.855419 54   0 M (1.00000000 0.00000000) *
##           33) texture_worst>=4.855419 12   4 M (0.66666667 0.33333333)  
##             66) texture_mean>=3.340739 8   0 M (1.00000000 0.00000000) *
##             67) texture_mean< 3.340739 4   0 B (0.00000000 1.00000000) *
##         17) smoothness_mean< -2.509667 2   0 B (0.00000000 1.00000000) *
##        9) compactness_se>=-3.57681 282 119 M (0.57801418 0.42198582)  
##         18) smoothness_worst>=-1.618016 251  94 M (0.62549801 0.37450199)  
##           36) texture_mean>=3.058688 84  16 M (0.80952381 0.19047619)  
##             72) smoothness_worst< -1.468038 68   5 M (0.92647059 0.07352941) *
##             73) smoothness_worst>=-1.468038 16   5 B (0.31250000 0.68750000) *
##           37) texture_mean< 3.058688 167  78 M (0.53293413 0.46706587)  
##             74) texture_mean< 3.001714 111  32 M (0.71171171 0.28828829) *
##             75) texture_mean>=3.001714 56  10 B (0.17857143 0.82142857) *
##         19) smoothness_worst< -1.618016 31   6 B (0.19354839 0.80645161)  
##           38) smoothness_worst< -1.694287 5   0 M (1.00000000 0.00000000) *
##           39) smoothness_worst>=-1.694287 26   1 B (0.03846154 0.96153846)  
##             78) texture_mean>=3.166164 1   0 M (1.00000000 0.00000000) *
##             79) texture_mean< 3.166164 25   0 B (0.00000000 1.00000000) *
##      5) compactness_se< -3.721197 523 259 B (0.49521989 0.50478011)  
##       10) smoothness_worst>=-1.48191 161  57 M (0.64596273 0.35403727)  
##         20) compactness_se>=-4.510489 148  44 M (0.70270270 0.29729730)  
##           40) smoothness_mean< -2.235394 122  27 M (0.77868852 0.22131148)  
##             80) texture_worst< 4.550742 66   4 M (0.93939394 0.06060606) *
##             81) texture_worst>=4.550742 56  23 M (0.58928571 0.41071429) *
##           41) smoothness_mean>=-2.235394 26   9 B (0.34615385 0.65384615)  
##             82) texture_mean>=2.939917 12   3 M (0.75000000 0.25000000) *
##             83) texture_mean< 2.939917 14   0 B (0.00000000 1.00000000) *
##         21) compactness_se< -4.510489 13   0 B (0.00000000 1.00000000) *
##       11) smoothness_worst< -1.48191 362 155 B (0.42817680 0.57182320)  
##         22) compactness_se< -3.869459 313 155 B (0.49520767 0.50479233)  
##           44) smoothness_mean< -2.294121 287 132 M (0.54006969 0.45993031)  
##             88) texture_mean>=2.892314 224  83 M (0.62946429 0.37053571) *
##             89) texture_mean< 2.892314 63  14 B (0.22222222 0.77777778) *
##           45) smoothness_mean>=-2.294121 26   0 B (0.00000000 1.00000000) *
##         23) compactness_se>=-3.869459 49   0 B (0.00000000 1.00000000) *
##    3) texture_mean< 2.709047 39   6 B (0.15384615 0.84615385)  
##      6) texture_worst< 3.858337 14   6 B (0.42857143 0.57142857)  
##       12) compactness_se>=-3.808227 8   2 M (0.75000000 0.25000000)  
##         24) smoothness_mean< -1.942706 6   0 M (1.00000000 0.00000000) *
##         25) smoothness_mean>=-1.942706 2   0 B (0.00000000 1.00000000) *
##       13) compactness_se< -3.808227 6   0 B (0.00000000 1.00000000) *
##      7) texture_worst>=3.858337 25   0 B (0.00000000 1.00000000) *
## 
## $trees[[28]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 445 B (0.48793860 0.51206140)  
##     2) texture_mean>=3.058002 289 116 M (0.59861592 0.40138408)  
##       4) smoothness_worst>=-1.603555 234  75 M (0.67948718 0.32051282)  
##         8) texture_worst< 4.797934 78   5 M (0.93589744 0.06410256)  
##          16) texture_worst>=4.496164 74   1 M (0.98648649 0.01351351)  
##            32) smoothness_worst< -1.469988 73   0 M (1.00000000 0.00000000) *
##            33) smoothness_worst>=-1.469988 1   0 B (0.00000000 1.00000000) *
##          17) texture_worst< 4.496164 4   0 B (0.00000000 1.00000000) *
##         9) texture_worst>=4.797934 156  70 M (0.55128205 0.44871795)  
##          18) smoothness_mean>=-2.501158 141  55 M (0.60992908 0.39007092)  
##            36) texture_worst>=4.897936 116  36 M (0.68965517 0.31034483)  
##              72) texture_mean>=3.082368 108  29 M (0.73148148 0.26851852) *
##              73) texture_mean< 3.082368 8   1 B (0.12500000 0.87500000) *
##            37) texture_worst< 4.897936 25   6 B (0.24000000 0.76000000)  
##              74) texture_mean< 3.09883 6   1 M (0.83333333 0.16666667) *
##              75) texture_mean>=3.09883 19   1 B (0.05263158 0.94736842) *
##          19) smoothness_mean< -2.501158 15   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst< -1.603555 55  14 B (0.25454545 0.74545455)  
##        10) compactness_se>=-3.013033 9   1 M (0.88888889 0.11111111)  
##          20) texture_mean>=3.076827 8   0 M (1.00000000 0.00000000) *
##          21) texture_mean< 3.076827 1   0 B (0.00000000 1.00000000) *
##        11) compactness_se< -3.013033 46   6 B (0.13043478 0.86956522)  
##          22) compactness_se< -4.507137 10   4 M (0.60000000 0.40000000)  
##            44) compactness_se>=-4.572499 6   0 M (1.00000000 0.00000000) *
##            45) compactness_se< -4.572499 4   0 B (0.00000000 1.00000000) *
##          23) compactness_se>=-4.507137 36   0 B (0.00000000 1.00000000) *
##     3) texture_mean< 3.058002 623 272 B (0.43659711 0.56340289)  
##       6) symmetry_worst>=-1.325507 30   4 M (0.86666667 0.13333333)  
##        12) compactness_se< -2.588521 27   2 M (0.92592593 0.07407407)  
##          24) smoothness_mean< -2.022167 26   1 M (0.96153846 0.03846154)  
##            48) smoothness_mean>=-2.340816 24   0 M (1.00000000 0.00000000) *
##            49) smoothness_mean< -2.340816 2   1 M (0.50000000 0.50000000)  
##              98) texture_mean>=2.868073 1   0 M (1.00000000 0.00000000) *
##              99) texture_mean< 2.868073 1   0 B (0.00000000 1.00000000) *
##          25) smoothness_mean>=-2.022167 1   0 B (0.00000000 1.00000000) *
##        13) compactness_se>=-2.588521 3   1 B (0.33333333 0.66666667)  
##          26) texture_mean>=2.915767 1   0 M (1.00000000 0.00000000) *
##          27) texture_mean< 2.915767 2   0 B (0.00000000 1.00000000) *
##       7) symmetry_worst< -1.325507 593 246 B (0.41483980 0.58516020)  
##        14) compactness_se< -3.476676 462 212 B (0.45887446 0.54112554)  
##          28) compactness_se>=-3.883198 167  63 M (0.62275449 0.37724551)  
##            56) smoothness_mean>=-2.321775 89  20 M (0.77528090 0.22471910)  
##             112) smoothness_worst>=-1.456304 30   1 M (0.96666667 0.03333333) *
##             113) smoothness_worst< -1.456304 59  19 M (0.67796610 0.32203390) *
##            57) smoothness_mean< -2.321775 78  35 B (0.44871795 0.55128205)  
##             114) smoothness_worst< -1.598495 23   2 M (0.91304348 0.08695652) *
##             115) smoothness_worst>=-1.598495 55  14 B (0.25454545 0.74545455) *
##          29) compactness_se< -3.883198 295 108 B (0.36610169 0.63389831)  
##            58) texture_mean>=2.809391 261 108 B (0.41379310 0.58620690)  
##             116) texture_mean< 2.848102 34   9 M (0.73529412 0.26470588) *
##             117) texture_mean>=2.848102 227  83 B (0.36563877 0.63436123) *
##            59) texture_mean< 2.809391 34   0 B (0.00000000 1.00000000) *
##        15) compactness_se>=-3.476676 131  34 B (0.25954198 0.74045802)  
##          30) smoothness_worst>=-1.502084 56  27 B (0.48214286 0.51785714)  
##            60) smoothness_worst< -1.468619 25   4 M (0.84000000 0.16000000)  
##             120) texture_mean>=2.8622 18   0 M (1.00000000 0.00000000) *
##             121) texture_mean< 2.8622 7   3 B (0.42857143 0.57142857) *
##            61) smoothness_worst>=-1.468619 31   6 B (0.19354839 0.80645161)  
##             122) smoothness_mean>=-2.049356 3   0 M (1.00000000 0.00000000) *
##             123) smoothness_mean< -2.049356 28   3 B (0.10714286 0.89285714) *
##          31) smoothness_worst< -1.502084 75   7 B (0.09333333 0.90666667)  
##            62) texture_mean>=3.038537 2   0 M (1.00000000 0.00000000) *
##            63) texture_mean< 3.038537 73   5 B (0.06849315 0.93150685)  
##             126) smoothness_worst< -1.568787 19   5 B (0.26315789 0.73684211) *
##             127) smoothness_worst>=-1.568787 54   0 B (0.00000000 1.00000000) *
## 
## $trees[[29]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 435 B (0.47697368 0.52302632)  
##     2) texture_worst>=4.362076 703 335 M (0.52347084 0.47652916)  
##       4) texture_mean< 3.087399 504 212 M (0.57936508 0.42063492)  
##         8) smoothness_mean>=-2.199595 40   5 M (0.87500000 0.12500000)  
##          16) texture_mean>=2.918531 29   0 M (1.00000000 0.00000000) *
##          17) texture_mean< 2.918531 11   5 M (0.54545455 0.45454545)  
##            34) texture_mean< 2.899496 6   0 M (1.00000000 0.00000000) *
##            35) texture_mean>=2.899496 5   0 B (0.00000000 1.00000000) *
##         9) smoothness_mean< -2.199595 464 207 M (0.55387931 0.44612069)  
##          18) smoothness_mean< -2.235862 408 165 M (0.59558824 0.40441176)  
##            36) smoothness_mean>=-2.391854 230  71 M (0.69130435 0.30869565)  
##              72) smoothness_worst< -1.476997 139  23 M (0.83453237 0.16546763) *
##              73) smoothness_worst>=-1.476997 91  43 B (0.47252747 0.52747253) *
##            37) smoothness_mean< -2.391854 178  84 B (0.47191011 0.52808989)  
##              74) smoothness_worst< -1.549836 137  56 M (0.59124088 0.40875912) *
##              75) smoothness_worst>=-1.549836 41   3 B (0.07317073 0.92682927) *
##          19) smoothness_mean>=-2.235862 56  14 B (0.25000000 0.75000000)  
##            38) compactness_se< -4.140724 12   2 M (0.83333333 0.16666667)  
##              76) smoothness_mean>=-2.21595 10   0 M (1.00000000 0.00000000) *
##              77) smoothness_mean< -2.21595 2   0 B (0.00000000 1.00000000) *
##            39) compactness_se>=-4.140724 44   4 B (0.09090909 0.90909091)  
##              78) texture_mean>=3.04949 4   0 M (1.00000000 0.00000000) *
##              79) texture_mean< 3.04949 40   0 B (0.00000000 1.00000000) *
##       5) texture_mean>=3.087399 199  76 B (0.38190955 0.61809045)  
##        10) smoothness_worst>=-1.603555 162  73 B (0.45061728 0.54938272)  
##          20) texture_worst< 4.803681 31   6 M (0.80645161 0.19354839)  
##            40) smoothness_mean< -2.29363 24   0 M (1.00000000 0.00000000) *
##            41) smoothness_mean>=-2.29363 7   1 B (0.14285714 0.85714286)  
##              82) smoothness_mean>=-2.242961 1   0 M (1.00000000 0.00000000) *
##              83) smoothness_mean< -2.242961 6   0 B (0.00000000 1.00000000) *
##          21) texture_worst>=4.803681 131  48 B (0.36641221 0.63358779)  
##            42) smoothness_worst< -1.582589 7   0 M (1.00000000 0.00000000) *
##            43) smoothness_worst>=-1.582589 124  41 B (0.33064516 0.66935484)  
##              86) texture_mean< 3.171358 31  14 M (0.54838710 0.45161290) *
##              87) texture_mean>=3.171358 93  24 B (0.25806452 0.74193548) *
##        11) smoothness_worst< -1.603555 37   3 B (0.08108108 0.91891892)  
##          22) smoothness_mean>=-2.337942 2   0 M (1.00000000 0.00000000) *
##          23) smoothness_mean< -2.337942 35   1 B (0.02857143 0.97142857)  
##            46) symmetry_worst< -2.632248 1   0 M (1.00000000 0.00000000) *
##            47) symmetry_worst>=-2.632248 34   0 B (0.00000000 1.00000000) *
##     3) texture_worst< 4.362076 209  67 B (0.32057416 0.67942584)  
##       6) compactness_se>=-4.173143 175  67 B (0.38285714 0.61714286)  
##        12) compactness_se< -4.160164 8   0 M (1.00000000 0.00000000) *
##        13) compactness_se>=-4.160164 167  59 B (0.35329341 0.64670659)  
##          26) texture_mean< 2.801532 105  49 B (0.46666667 0.53333333)  
##            52) texture_worst>=4.178472 27   4 M (0.85185185 0.14814815)  
##             104) compactness_se>=-3.892047 23   0 M (1.00000000 0.00000000) *
##             105) compactness_se< -3.892047 4   0 B (0.00000000 1.00000000) *
##            53) texture_worst< 4.178472 78  26 B (0.33333333 0.66666667)  
##             106) compactness_se>=-2.94014 7   1 M (0.85714286 0.14285714) *
##             107) compactness_se< -2.94014 71  20 B (0.28169014 0.71830986) *
##          27) texture_mean>=2.801532 62  10 B (0.16129032 0.83870968)  
##            54) texture_worst< 4.138009 4   0 M (1.00000000 0.00000000) *
##            55) texture_worst>=4.138009 58   6 B (0.10344828 0.89655172)  
##             110) smoothness_worst>=-1.384694 2   0 M (1.00000000 0.00000000) *
##             111) smoothness_worst< -1.384694 56   4 B (0.07142857 0.92857143) *
##       7) compactness_se< -4.173143 34   0 B (0.00000000 1.00000000) *
## 
## $trees[[30]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 440 B (0.48245614 0.51754386)  
##     2) smoothness_mean>=-2.424301 673 308 M (0.54234770 0.45765230)  
##       4) symmetry_worst< -2.379234 29   2 M (0.93103448 0.06896552)  
##         8) texture_mean>=2.855865 28   1 M (0.96428571 0.03571429)  
##          16) smoothness_mean< -2.287736 25   0 M (1.00000000 0.00000000) *
##          17) smoothness_mean>=-2.287736 3   1 M (0.66666667 0.33333333)  
##            34) texture_mean>=3.050671 2   0 M (1.00000000 0.00000000) *
##            35) texture_mean< 3.050671 1   0 B (0.00000000 1.00000000) *
##         9) texture_mean< 2.855865 1   0 B (0.00000000 1.00000000) *
##       5) symmetry_worst>=-2.379234 644 306 M (0.52484472 0.47515528)  
##        10) smoothness_worst>=-1.559144 554 240 M (0.56678700 0.43321300)  
##          20) smoothness_worst< -1.536824 76  10 M (0.86842105 0.13157895)  
##            40) symmetry_worst< -1.583647 71   6 M (0.91549296 0.08450704)  
##              80) texture_mean>=2.6809 69   4 M (0.94202899 0.05797101) *
##              81) texture_mean< 2.6809 2   0 B (0.00000000 1.00000000) *
##            41) symmetry_worst>=-1.583647 5   1 B (0.20000000 0.80000000)  
##              82) texture_mean>=3.07122 1   0 M (1.00000000 0.00000000) *
##              83) texture_mean< 3.07122 4   0 B (0.00000000 1.00000000) *
##          21) smoothness_worst>=-1.536824 478 230 M (0.51882845 0.48117155)  
##            42) smoothness_worst>=-1.525694 440 198 M (0.55000000 0.45000000)  
##              84) compactness_se>=-4.557422 427 185 M (0.56674473 0.43325527) *
##              85) compactness_se< -4.557422 13   0 B (0.00000000 1.00000000) *
##            43) smoothness_worst< -1.525694 38   6 B (0.15789474 0.84210526)  
##              86) smoothness_mean>=-2.170258 6   1 M (0.83333333 0.16666667) *
##              87) smoothness_mean< -2.170258 32   1 B (0.03125000 0.96875000) *
##        11) smoothness_worst< -1.559144 90  24 B (0.26666667 0.73333333)  
##          22) compactness_se>=-3.745127 46  23 M (0.50000000 0.50000000)  
##            44) texture_worst>=4.585652 22   4 M (0.81818182 0.18181818)  
##              88) smoothness_worst>=-1.618016 19   1 M (0.94736842 0.05263158) *
##              89) smoothness_worst< -1.618016 3   0 B (0.00000000 1.00000000) *
##            45) texture_worst< 4.585652 24   5 B (0.20833333 0.79166667)  
##              90) texture_mean>=2.969227 3   0 M (1.00000000 0.00000000) *
##              91) texture_mean< 2.969227 21   2 B (0.09523810 0.90476190) *
##          23) compactness_se< -3.745127 44   1 B (0.02272727 0.97727273)  
##            46) smoothness_mean< -2.399947 3   1 B (0.33333333 0.66666667)  
##              92) texture_mean< 3.119032 1   0 M (1.00000000 0.00000000) *
##              93) texture_mean>=3.119032 2   0 B (0.00000000 1.00000000) *
##            47) smoothness_mean>=-2.399947 41   0 B (0.00000000 1.00000000) *
##     3) smoothness_mean< -2.424301 239  75 B (0.31380753 0.68619247)  
##       6) smoothness_worst< -1.551775 182  70 B (0.38461538 0.61538462)  
##        12) smoothness_worst>=-1.556752 26   3 M (0.88461538 0.11538462)  
##          24) texture_mean>=2.850634 23   0 M (1.00000000 0.00000000) *
##          25) texture_mean< 2.850634 3   0 B (0.00000000 1.00000000) *
##        13) smoothness_worst< -1.556752 156  47 B (0.30128205 0.69871795)  
##          26) smoothness_worst< -1.576547 123  47 B (0.38211382 0.61788618)  
##            52) symmetry_worst>=-1.781697 42  13 M (0.69047619 0.30952381)  
##             104) smoothness_mean< -2.47008 34   6 M (0.82352941 0.17647059) *
##             105) smoothness_mean>=-2.47008 8   1 B (0.12500000 0.87500000) *
##            53) symmetry_worst< -1.781697 81  18 B (0.22222222 0.77777778)  
##             106) compactness_se>=-3.514597 23   9 M (0.60869565 0.39130435) *
##             107) compactness_se< -3.514597 58   4 B (0.06896552 0.93103448) *
##          27) smoothness_worst>=-1.576547 33   0 B (0.00000000 1.00000000) *
##       7) smoothness_worst>=-1.551775 57   5 B (0.08771930 0.91228070)  
##        14) symmetry_worst< -1.893206 5   0 M (1.00000000 0.00000000) *
##        15) symmetry_worst>=-1.893206 52   0 B (0.00000000 1.00000000) *
## 
## $trees[[31]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 381 B (0.41776316 0.58223684)  
##     2) symmetry_worst>=-1.366937 34   8 M (0.76470588 0.23529412)  
##       4) compactness_se>=-3.486288 17   0 M (1.00000000 0.00000000) *
##       5) compactness_se< -3.486288 17   8 M (0.52941176 0.47058824)  
##        10) compactness_se< -4.00428 7   0 M (1.00000000 0.00000000) *
##        11) compactness_se>=-4.00428 10   2 B (0.20000000 0.80000000)  
##          22) smoothness_mean< -2.419235 2   0 M (1.00000000 0.00000000) *
##          23) smoothness_mean>=-2.419235 8   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.366937 878 355 B (0.40432802 0.59567198)  
##       6) symmetry_worst< -1.56292 731 318 B (0.43502052 0.56497948)  
##        12) smoothness_worst>=-1.637109 684 312 B (0.45614035 0.54385965)  
##          24) smoothness_mean< -2.299091 435 209 M (0.51954023 0.48045977)  
##            48) texture_worst< 4.465917 120  32 M (0.73333333 0.26666667)  
##              96) compactness_se< -3.377574 110  22 M (0.80000000 0.20000000) *
##              97) compactness_se>=-3.377574 10   0 B (0.00000000 1.00000000) *
##            49) texture_worst>=4.465917 315 138 B (0.43809524 0.56190476)  
##              98) texture_worst>=4.578048 240 120 M (0.50000000 0.50000000) *
##              99) texture_worst< 4.578048 75  18 B (0.24000000 0.76000000) *
##          25) smoothness_mean>=-2.299091 249  86 B (0.34538153 0.65461847)  
##            50) compactness_se>=-3.02233 14   1 M (0.92857143 0.07142857)  
##             100) texture_mean>=2.81216 13   0 M (1.00000000 0.00000000) *
##             101) texture_mean< 2.81216 1   0 B (0.00000000 1.00000000) *
##            51) compactness_se< -3.02233 235  73 B (0.31063830 0.68936170)  
##             102) symmetry_worst>=-1.65118 37  12 M (0.67567568 0.32432432) *
##             103) symmetry_worst< -1.65118 198  48 B (0.24242424 0.75757576) *
##        13) smoothness_worst< -1.637109 47   6 B (0.12765957 0.87234043)  
##          26) compactness_se>=-2.979429 7   3 M (0.57142857 0.42857143)  
##            52) texture_mean>=3.076827 4   0 M (1.00000000 0.00000000) *
##            53) texture_mean< 3.076827 3   0 B (0.00000000 1.00000000) *
##          27) compactness_se< -2.979429 40   2 B (0.05000000 0.95000000)  
##            54) smoothness_mean>=-2.38784 1   0 M (1.00000000 0.00000000) *
##            55) smoothness_mean< -2.38784 39   1 B (0.02564103 0.97435897)  
##             110) symmetry_worst>=-1.800994 4   1 B (0.25000000 0.75000000) *
##             111) symmetry_worst< -1.800994 35   0 B (0.00000000 1.00000000) *
##       7) symmetry_worst>=-1.56292 147  37 B (0.25170068 0.74829932)  
##        14) smoothness_mean>=-2.155028 8   0 M (1.00000000 0.00000000) *
##        15) smoothness_mean< -2.155028 139  29 B (0.20863309 0.79136691)  
##          30) texture_worst>=5.204837 6   0 M (1.00000000 0.00000000) *
##          31) texture_worst< 5.204837 133  23 B (0.17293233 0.82706767)  
##            62) smoothness_mean< -2.454281 15   7 B (0.46666667 0.53333333)  
##             124) smoothness_mean>=-2.462871 7   0 M (1.00000000 0.00000000) *
##             125) smoothness_mean< -2.462871 8   0 B (0.00000000 1.00000000) *
##            63) smoothness_mean>=-2.454281 118  16 B (0.13559322 0.86440678)  
##             126) texture_mean>=3.01402 31  10 B (0.32258065 0.67741935) *
##             127) texture_mean< 3.01402 87   6 B (0.06896552 0.93103448) *
## 
## $trees[[32]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 385 B (0.42214912 0.57785088)  
##     2) symmetry_worst< -2.49184 28   5 M (0.82142857 0.17857143)  
##       4) smoothness_worst< -1.534654 25   2 M (0.92000000 0.08000000)  
##         8) texture_mean>=2.861235 23   0 M (1.00000000 0.00000000) *
##         9) texture_mean< 2.861235 2   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst>=-1.534654 3   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst>=-2.49184 884 362 B (0.40950226 0.59049774)  
##       6) smoothness_worst>=-1.558926 661 302 B (0.45688351 0.54311649)  
##        12) smoothness_mean>=-2.48495 631 302 B (0.47860539 0.52139461)  
##          24) smoothness_worst< -1.541278 87  21 M (0.75862069 0.24137931)  
##            48) smoothness_mean< -2.313857 81  15 M (0.81481481 0.18518519)  
##              96) symmetry_worst< -1.809006 55   1 M (0.98181818 0.01818182) *
##              97) symmetry_worst>=-1.809006 26  12 B (0.46153846 0.53846154) *
##            49) smoothness_mean>=-2.313857 6   0 B (0.00000000 1.00000000) *
##          25) smoothness_worst>=-1.541278 544 236 B (0.43382353 0.56617647)  
##            50) symmetry_worst>=-2.178473 512 235 B (0.45898438 0.54101562)  
##             100) texture_mean>=2.836149 404 196 M (0.51485149 0.48514851) *
##             101) texture_mean< 2.836149 108  27 B (0.25000000 0.75000000) *
##            51) symmetry_worst< -2.178473 32   1 B (0.03125000 0.96875000)  
##             102) texture_worst>=4.947241 6   1 B (0.16666667 0.83333333) *
##             103) texture_worst< 4.947241 26   0 B (0.00000000 1.00000000) *
##        13) smoothness_mean< -2.48495 30   0 B (0.00000000 1.00000000) *
##       7) smoothness_worst< -1.558926 223  60 B (0.26905830 0.73094170)  
##        14) smoothness_worst< -1.57166 165  56 B (0.33939394 0.66060606)  
##          28) symmetry_worst>=-1.787851 56  25 M (0.55357143 0.44642857)  
##            56) texture_mean>=2.933058 47  16 M (0.65957447 0.34042553)  
##             112) texture_worst< 4.733599 30   5 M (0.83333333 0.16666667) *
##             113) texture_worst>=4.733599 17   6 B (0.35294118 0.64705882) *
##            57) texture_mean< 2.933058 9   0 B (0.00000000 1.00000000) *
##          29) symmetry_worst< -1.787851 109  25 B (0.22935780 0.77064220)  
##            58) texture_worst>=4.898911 22  10 M (0.54545455 0.45454545)  
##             116) texture_worst< 5.13268 10   0 M (1.00000000 0.00000000) *
##             117) texture_worst>=5.13268 12   2 B (0.16666667 0.83333333) *
##            59) texture_worst< 4.898911 87  13 B (0.14942529 0.85057471)  
##             118) smoothness_worst< -1.709736 6   1 M (0.83333333 0.16666667) *
##             119) smoothness_worst>=-1.709736 81   8 B (0.09876543 0.90123457) *
##        15) smoothness_worst>=-1.57166 58   4 B (0.06896552 0.93103448)  
##          30) compactness_se>=-2.682598 2   0 M (1.00000000 0.00000000) *
##          31) compactness_se< -2.682598 56   2 B (0.03571429 0.96428571)  
##            62) smoothness_mean>=-2.299648 1   0 M (1.00000000 0.00000000) *
##            63) smoothness_mean< -2.299648 55   1 B (0.01818182 0.98181818)  
##             126) smoothness_worst< -1.568787 7   1 B (0.14285714 0.85714286) *
##             127) smoothness_worst>=-1.568787 48   0 B (0.00000000 1.00000000) *
## 
## $trees[[33]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 382 B (0.41885965 0.58114035)  
##     2) symmetry_worst>=-1.353976 49  13 M (0.73469388 0.26530612)  
##       4) symmetry_worst< -1.244631 26   2 M (0.92307692 0.07692308)  
##         8) compactness_se>=-3.807179 23   0 M (1.00000000 0.00000000) *
##         9) compactness_se< -3.807179 3   1 B (0.33333333 0.66666667)  
##          18) texture_mean>=2.933998 1   0 M (1.00000000 0.00000000) *
##          19) texture_mean< 2.933998 2   0 B (0.00000000 1.00000000) *
##       5) symmetry_worst>=-1.244631 23  11 M (0.52173913 0.47826087)  
##        10) symmetry_worst>=-1.072749 12   0 M (1.00000000 0.00000000) *
##        11) symmetry_worst< -1.072749 11   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.353976 863 346 B (0.40092700 0.59907300)  
##       6) smoothness_mean>=-2.425205 656 291 B (0.44359756 0.55640244)  
##        12) smoothness_mean< -2.233059 531 260 B (0.48964218 0.51035782)  
##          24) texture_worst< 4.555292 249  97 M (0.61044177 0.38955823)  
##            48) symmetry_worst< -1.700875 185  55 M (0.70270270 0.29729730)  
##              96) compactness_se< -3.48728 138  28 M (0.79710145 0.20289855) *
##              97) compactness_se>=-3.48728 47  20 B (0.42553191 0.57446809) *
##            49) symmetry_worst>=-1.700875 64  22 B (0.34375000 0.65625000)  
##              98) texture_worst>=4.517878 9   0 M (1.00000000 0.00000000) *
##              99) texture_worst< 4.517878 55  13 B (0.23636364 0.76363636) *
##          25) texture_worst>=4.555292 282 108 B (0.38297872 0.61702128)  
##            50) smoothness_worst>=-1.419909 15   0 M (1.00000000 0.00000000) *
##            51) smoothness_worst< -1.419909 267  93 B (0.34831461 0.65168539)  
##             102) smoothness_worst< -1.484675 146  67 B (0.45890411 0.54109589) *
##             103) smoothness_worst>=-1.484675 121  26 B (0.21487603 0.78512397) *
##        13) smoothness_mean>=-2.233059 125  31 B (0.24800000 0.75200000)  
##          26) texture_worst>=5.026995 6   0 M (1.00000000 0.00000000) *
##          27) texture_worst< 5.026995 119  25 B (0.21008403 0.78991597)  
##            54) smoothness_worst< -1.531558 6   0 M (1.00000000 0.00000000) *
##            55) smoothness_worst>=-1.531558 113  19 B (0.16814159 0.83185841)  
##             110) symmetry_worst>=-1.532237 17   8 M (0.52941176 0.47058824) *
##             111) symmetry_worst< -1.532237 96  10 B (0.10416667 0.89583333) *
##       7) smoothness_mean< -2.425205 207  55 B (0.26570048 0.73429952)  
##        14) smoothness_worst>=-1.653746 172  55 B (0.31976744 0.68023256)  
##          28) smoothness_worst< -1.576547 79  39 B (0.49367089 0.50632911)  
##            56) symmetry_worst>=-2.050548 64  25 M (0.60937500 0.39062500)  
##             112) compactness_se< -3.427985 56  17 M (0.69642857 0.30357143) *
##             113) compactness_se>=-3.427985 8   0 B (0.00000000 1.00000000) *
##            57) symmetry_worst< -2.050548 15   0 B (0.00000000 1.00000000) *
##          29) smoothness_worst>=-1.576547 93  16 B (0.17204301 0.82795699)  
##            58) texture_mean>=3.431166 3   0 M (1.00000000 0.00000000) *
##            59) texture_mean< 3.431166 90  13 B (0.14444444 0.85555556)  
##             118) texture_worst< 4.62656 40  12 B (0.30000000 0.70000000) *
##             119) texture_worst>=4.62656 50   1 B (0.02000000 0.98000000) *
##        15) smoothness_worst< -1.653746 35   0 B (0.00000000 1.00000000) *
## 
## $trees[[34]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 415 B (0.45504386 0.54495614)  
##     2) smoothness_worst< -1.54469 286 128 M (0.55244755 0.44755245)  
##       4) smoothness_worst>=-1.55958 84  16 M (0.80952381 0.19047619)  
##         8) compactness_se>=-4.694501 80  12 M (0.85000000 0.15000000)  
##          16) texture_mean< 3.353705 78  10 M (0.87179487 0.12820513)  
##            32) smoothness_mean< -2.313857 70   6 M (0.91428571 0.08571429)  
##              64) symmetry_worst< -1.801798 55   1 M (0.98181818 0.01818182) *
##              65) symmetry_worst>=-1.801798 15   5 M (0.66666667 0.33333333) *
##            33) smoothness_mean>=-2.313857 8   4 M (0.50000000 0.50000000)  
##              66) texture_mean< 3.271203 4   0 M (1.00000000 0.00000000) *
##              67) texture_mean>=3.271203 4   0 B (0.00000000 1.00000000) *
##          17) texture_mean>=3.353705 2   0 B (0.00000000 1.00000000) *
##         9) compactness_se< -4.694501 4   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst< -1.55958 202  90 B (0.44554455 0.55445545)  
##        10) smoothness_worst< -1.568787 170  83 M (0.51176471 0.48823529)  
##          20) smoothness_mean>=-2.337942 25   2 M (0.92000000 0.08000000)  
##            40) smoothness_mean< -2.294641 18   0 M (1.00000000 0.00000000) *
##            41) smoothness_mean>=-2.294641 7   2 M (0.71428571 0.28571429)  
##              82) smoothness_mean>=-2.214186 5   0 M (1.00000000 0.00000000) *
##              83) smoothness_mean< -2.214186 2   0 B (0.00000000 1.00000000) *
##          21) smoothness_mean< -2.337942 145  64 B (0.44137931 0.55862069)  
##            42) symmetry_worst>=-1.795801 43  12 M (0.72093023 0.27906977)  
##              84) texture_mean>=2.939162 33   5 M (0.84848485 0.15151515) *
##              85) texture_mean< 2.939162 10   3 B (0.30000000 0.70000000) *
##            43) symmetry_worst< -1.795801 102  33 B (0.32352941 0.67647059)  
##              86) smoothness_worst< -1.694089 7   0 M (1.00000000 0.00000000) *
##              87) smoothness_worst>=-1.694089 95  26 B (0.27368421 0.72631579) *
##        11) smoothness_worst>=-1.568787 32   3 B (0.09375000 0.90625000)  
##          22) compactness_se>=-2.682598 2   0 M (1.00000000 0.00000000) *
##          23) compactness_se< -2.682598 30   1 B (0.03333333 0.96666667)  
##            46) smoothness_mean>=-2.296106 1   0 M (1.00000000 0.00000000) *
##            47) smoothness_mean< -2.296106 29   0 B (0.00000000 1.00000000) *
##     3) smoothness_worst>=-1.54469 626 257 B (0.41054313 0.58945687)  
##       6) smoothness_worst>=-1.501069 464 215 B (0.46336207 0.53663793)  
##        12) texture_mean< 3.039744 358 173 M (0.51675978 0.48324022)  
##          24) texture_mean>=2.967331 101  20 M (0.80198020 0.19801980)  
##            48) symmetry_worst>=-1.839419 74   6 M (0.91891892 0.08108108)  
##              96) symmetry_worst< -1.471051 68   2 M (0.97058824 0.02941176) *
##              97) symmetry_worst>=-1.471051 6   2 B (0.33333333 0.66666667) *
##            49) symmetry_worst< -1.839419 27  13 B (0.48148148 0.51851852)  
##              98) symmetry_worst< -1.878579 13   0 M (1.00000000 0.00000000) *
##              99) symmetry_worst>=-1.878579 14   0 B (0.00000000 1.00000000) *
##          25) texture_mean< 2.967331 257 104 B (0.40466926 0.59533074)  
##            50) smoothness_mean< -2.267218 116  46 M (0.60344828 0.39655172)  
##             100) texture_worst< 4.543572 76  17 M (0.77631579 0.22368421) *
##             101) texture_worst>=4.543572 40  11 B (0.27500000 0.72500000) *
##            51) smoothness_mean>=-2.267218 141  34 B (0.24113475 0.75886525)  
##             102) symmetry_worst>=-1.74232 74  33 B (0.44594595 0.55405405) *
##             103) symmetry_worst< -1.74232 67   1 B (0.01492537 0.98507463) *
##        13) texture_mean>=3.039744 106  30 B (0.28301887 0.71698113)  
##          26) compactness_se>=-3.334337 16   2 M (0.87500000 0.12500000)  
##            52) smoothness_mean>=-2.420336 14   0 M (1.00000000 0.00000000) *
##            53) smoothness_mean< -2.420336 2   0 B (0.00000000 1.00000000) *
##          27) compactness_se< -3.334337 90  16 B (0.17777778 0.82222222)  
##            54) symmetry_worst< -1.822663 17   8 M (0.52941176 0.47058824)  
##             108) compactness_se< -3.615775 8   0 M (1.00000000 0.00000000) *
##             109) compactness_se>=-3.615775 9   1 B (0.11111111 0.88888889) *
##            55) symmetry_worst>=-1.822663 73   7 B (0.09589041 0.90410959)  
##             110) smoothness_worst< -1.483884 4   0 M (1.00000000 0.00000000) *
##             111) smoothness_worst>=-1.483884 69   3 B (0.04347826 0.95652174) *
##       7) smoothness_worst< -1.501069 162  42 B (0.25925926 0.74074074)  
##        14) texture_worst>=4.536474 107  39 B (0.36448598 0.63551402)  
##          28) smoothness_worst< -1.510008 79  38 B (0.48101266 0.51898734)  
##            56) texture_worst< 4.774294 41  13 M (0.68292683 0.31707317)  
##             112) smoothness_mean>=-2.405234 34   6 M (0.82352941 0.17647059) *
##             113) smoothness_mean< -2.405234 7   0 B (0.00000000 1.00000000) *
##            57) texture_worst>=4.774294 38  10 B (0.26315789 0.73684211)  
##             114) smoothness_worst>=-1.52112 7   0 M (1.00000000 0.00000000) *
##             115) smoothness_worst< -1.52112 31   3 B (0.09677419 0.90322581) *
##          29) smoothness_worst>=-1.510008 28   1 B (0.03571429 0.96428571)  
##            58) compactness_se>=-3.44344 3   1 B (0.33333333 0.66666667)  
##             116) texture_mean< 3.145585 1   0 M (1.00000000 0.00000000) *
##             117) texture_mean>=3.145585 2   0 B (0.00000000 1.00000000) *
##            59) compactness_se< -3.44344 25   0 B (0.00000000 1.00000000) *
##        15) texture_worst< 4.536474 55   3 B (0.05454545 0.94545455)  
##          30) smoothness_mean>=-2.194169 3   1 M (0.66666667 0.33333333)  
##            60) texture_mean>=2.816952 2   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 2.816952 1   0 B (0.00000000 1.00000000) *
##          31) smoothness_mean< -2.194169 52   1 B (0.01923077 0.98076923)  
##            62) smoothness_worst< -1.541278 2   1 M (0.50000000 0.50000000)  
##             124) texture_mean>=2.773152 1   0 M (1.00000000 0.00000000) *
##             125) texture_mean< 2.773152 1   0 B (0.00000000 1.00000000) *
##            63) smoothness_worst>=-1.541278 50   0 B (0.00000000 1.00000000) *
## 
## $trees[[35]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 408 B (0.44736842 0.55263158)  
##     2) compactness_se>=-3.721197 402 177 M (0.55970149 0.44029851)  
##       4) compactness_se< -3.575734 76  10 M (0.86842105 0.13157895)  
##         8) texture_mean>=2.647471 73   7 M (0.90410959 0.09589041)  
##          16) texture_mean< 3.07431 54   0 M (1.00000000 0.00000000) *
##          17) texture_mean>=3.07431 19   7 M (0.63157895 0.36842105)  
##            34) symmetry_worst>=-1.999513 14   2 M (0.85714286 0.14285714)  
##              68) texture_mean>=3.087384 12   0 M (1.00000000 0.00000000) *
##              69) texture_mean< 3.087384 2   0 B (0.00000000 1.00000000) *
##            35) symmetry_worst< -1.999513 5   0 B (0.00000000 1.00000000) *
##         9) texture_mean< 2.647471 3   0 B (0.00000000 1.00000000) *
##       5) compactness_se>=-3.575734 326 159 B (0.48773006 0.51226994)  
##        10) smoothness_worst>=-1.618016 298 141 M (0.52684564 0.47315436)  
##          20) texture_worst>=5.016194 18   0 M (1.00000000 0.00000000) *
##          21) texture_worst< 5.016194 280 139 B (0.49642857 0.50357143)  
##            42) smoothness_worst< -1.595509 16   1 M (0.93750000 0.06250000)  
##              84) compactness_se< -3.215213 14   0 M (1.00000000 0.00000000) *
##              85) compactness_se>=-3.215213 2   1 M (0.50000000 0.50000000) *
##            43) smoothness_worst>=-1.595509 264 124 B (0.46969697 0.53030303)  
##              86) smoothness_mean>=-2.294122 142  60 M (0.57746479 0.42253521) *
##              87) smoothness_mean< -2.294122 122  42 B (0.34426230 0.65573770) *
##        11) smoothness_worst< -1.618016 28   2 B (0.07142857 0.92857143)  
##          22) smoothness_worst< -1.707409 3   1 M (0.66666667 0.33333333)  
##            44) texture_mean< 3.103494 2   0 M (1.00000000 0.00000000) *
##            45) texture_mean>=3.103494 1   0 B (0.00000000 1.00000000) *
##          23) smoothness_worst>=-1.707409 25   0 B (0.00000000 1.00000000) *
##     3) compactness_se< -3.721197 510 183 B (0.35882353 0.64117647)  
##       6) texture_worst>=4.507583 346 148 B (0.42774566 0.57225434)  
##        12) smoothness_mean>=-2.407891 244 119 M (0.51229508 0.48770492)  
##          24) smoothness_mean< -2.382983 34   4 M (0.88235294 0.11764706)  
##            48) texture_mean>=2.920077 32   2 M (0.93750000 0.06250000)  
##              96) symmetry_worst>=-2.212871 30   0 M (1.00000000 0.00000000) *
##              97) symmetry_worst< -2.212871 2   0 B (0.00000000 1.00000000) *
##            49) texture_mean< 2.920077 2   0 B (0.00000000 1.00000000) *
##          25) smoothness_mean>=-2.382983 210  95 B (0.45238095 0.54761905)  
##            50) texture_mean< 2.903338 56  15 M (0.73214286 0.26785714)  
##             100) texture_worst< 4.637071 43   5 M (0.88372093 0.11627907) *
##             101) texture_worst>=4.637071 13   3 B (0.23076923 0.76923077) *
##            51) texture_mean>=2.903338 154  54 B (0.35064935 0.64935065)  
##             102) smoothness_worst>=-1.472892 70  30 M (0.57142857 0.42857143) *
##             103) smoothness_worst< -1.472892 84  14 B (0.16666667 0.83333333) *
##        13) smoothness_mean< -2.407891 102  23 B (0.22549020 0.77450980)  
##          26) texture_worst>=4.853342 44  17 B (0.38636364 0.61363636)  
##            52) texture_worst< 4.985267 19   6 M (0.68421053 0.31578947)  
##             104) texture_mean< 3.162414 13   0 M (1.00000000 0.00000000) *
##             105) texture_mean>=3.162414 6   0 B (0.00000000 1.00000000) *
##            53) texture_worst>=4.985267 25   4 B (0.16000000 0.84000000)  
##             106) smoothness_mean>=-2.427246 3   1 M (0.66666667 0.33333333) *
##             107) smoothness_mean< -2.427246 22   2 B (0.09090909 0.90909091) *
##          27) texture_worst< 4.853342 58   6 B (0.10344828 0.89655172)  
##            54) symmetry_worst>=-1.541105 17   6 B (0.35294118 0.64705882)  
##             108) smoothness_mean< -2.449189 4   0 M (1.00000000 0.00000000) *
##             109) smoothness_mean>=-2.449189 13   2 B (0.15384615 0.84615385) *
##            55) symmetry_worst< -1.541105 41   0 B (0.00000000 1.00000000) *
##       7) texture_worst< 4.507583 164  35 B (0.21341463 0.78658537)  
##        14) smoothness_mean< -2.411844 42  18 B (0.42857143 0.57142857)  
##          28) symmetry_worst>=-1.963801 28  10 M (0.64285714 0.35714286)  
##            56) texture_worst< 4.465917 22   4 M (0.81818182 0.18181818)  
##             112) compactness_se>=-4.531581 19   1 M (0.94736842 0.05263158) *
##             113) compactness_se< -4.531581 3   0 B (0.00000000 1.00000000) *
##            57) texture_worst>=4.465917 6   0 B (0.00000000 1.00000000) *
##          29) symmetry_worst< -1.963801 14   0 B (0.00000000 1.00000000) *
##        15) smoothness_mean>=-2.411844 122  17 B (0.13934426 0.86065574)  
##          30) symmetry_worst< -2.391709 5   0 M (1.00000000 0.00000000) *
##          31) symmetry_worst>=-2.391709 117  12 B (0.10256410 0.89743590)  
##            62) compactness_se>=-3.892047 22  10 B (0.45454545 0.54545455)  
##             124) smoothness_worst>=-1.482701 11   1 M (0.90909091 0.09090909) *
##             125) smoothness_worst< -1.482701 11   0 B (0.00000000 1.00000000) *
##            63) compactness_se< -3.892047 95   2 B (0.02105263 0.97894737)  
##             126) smoothness_worst>=-1.42613 3   1 M (0.66666667 0.33333333) *
##             127) smoothness_worst< -1.42613 92   0 B (0.00000000 1.00000000) *
## 
## $trees[[36]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 398 B (0.43640351 0.56359649)  
##     2) symmetry_worst>=-1.424186 75  22 M (0.70666667 0.29333333)  
##       4) texture_mean>=2.77286 58  11 M (0.81034483 0.18965517)  
##         8) smoothness_worst>=-1.49649 46   3 M (0.93478261 0.06521739)  
##          16) compactness_se< -2.524297 44   1 M (0.97727273 0.02272727)  
##            32) compactness_se>=-4.171724 40   0 M (1.00000000 0.00000000) *
##            33) compactness_se< -4.171724 4   1 M (0.75000000 0.25000000)  
##              66) texture_mean>=3.068796 3   0 M (1.00000000 0.00000000) *
##              67) texture_mean< 3.068796 1   0 B (0.00000000 1.00000000) *
##          17) compactness_se>=-2.524297 2   0 B (0.00000000 1.00000000) *
##         9) smoothness_worst< -1.49649 12   4 B (0.33333333 0.66666667)  
##          18) texture_mean>=3.126045 4   0 M (1.00000000 0.00000000) *
##          19) texture_mean< 3.126045 8   0 B (0.00000000 1.00000000) *
##       5) texture_mean< 2.77286 17   6 B (0.35294118 0.64705882)  
##        10) symmetry_worst>=-1.195967 6   0 M (1.00000000 0.00000000) *
##        11) symmetry_worst< -1.195967 11   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.424186 837 345 B (0.41218638 0.58781362)  
##       6) smoothness_mean< -2.23446 698 311 B (0.44555874 0.55444126)  
##        12) compactness_se>=-3.466377 133  50 M (0.62406015 0.37593985)  
##          24) compactness_se< -3.391153 58   7 M (0.87931034 0.12068966)  
##            48) smoothness_mean< -2.262968 55   4 M (0.92727273 0.07272727)  
##              96) smoothness_mean>=-2.562637 54   3 M (0.94444444 0.05555556) *
##              97) smoothness_mean< -2.562637 1   0 B (0.00000000 1.00000000) *
##            49) smoothness_mean>=-2.262968 3   0 B (0.00000000 1.00000000) *
##          25) compactness_se>=-3.391153 75  32 B (0.42666667 0.57333333)  
##            50) texture_mean>=3.038537 32   5 M (0.84375000 0.15625000)  
##             100) smoothness_worst>=-1.647098 28   2 M (0.92857143 0.07142857) *
##             101) smoothness_worst< -1.647098 4   1 B (0.25000000 0.75000000) *
##            51) texture_mean< 3.038537 43   5 B (0.11627907 0.88372093)  
##             102) smoothness_mean>=-2.242902 4   0 M (1.00000000 0.00000000) *
##             103) smoothness_mean< -2.242902 39   1 B (0.02564103 0.97435897) *
##        13) compactness_se< -3.466377 565 228 B (0.40353982 0.59646018)  
##          26) smoothness_mean>=-2.251418 27   5 M (0.81481481 0.18518519)  
##            52) smoothness_worst>=-1.46195 21   0 M (1.00000000 0.00000000) *
##            53) smoothness_worst< -1.46195 6   1 B (0.16666667 0.83333333)  
##             106) texture_mean>=3.037597 1   0 M (1.00000000 0.00000000) *
##             107) texture_mean< 3.037597 5   0 B (0.00000000 1.00000000) *
##          27) smoothness_mean< -2.251418 538 206 B (0.38289963 0.61710037)  
##            54) smoothness_worst>=-1.424105 15   1 M (0.93333333 0.06666667)  
##             108) smoothness_mean>=-2.397334 14   0 M (1.00000000 0.00000000) *
##             109) smoothness_mean< -2.397334 1   0 B (0.00000000 1.00000000) *
##            55) smoothness_worst< -1.424105 523 192 B (0.36711281 0.63288719)  
##             110) smoothness_mean< -2.299091 430 177 B (0.41162791 0.58837209) *
##             111) smoothness_mean>=-2.299091 93  15 B (0.16129032 0.83870968) *
##       7) smoothness_mean>=-2.23446 139  34 B (0.24460432 0.75539568)  
##        14) texture_mean>=3.209345 4   0 M (1.00000000 0.00000000) *
##        15) texture_mean< 3.209345 135  30 B (0.22222222 0.77777778)  
##          30) symmetry_worst>=-1.765259 62  23 B (0.37096774 0.62903226)  
##            60) smoothness_worst>=-1.468303 33  12 M (0.63636364 0.36363636)  
##             120) smoothness_worst< -1.423212 14   0 M (1.00000000 0.00000000) *
##             121) smoothness_worst>=-1.423212 19   7 B (0.36842105 0.63157895) *
##            61) smoothness_worst< -1.468303 29   2 B (0.06896552 0.93103448)  
##             122) texture_mean>=3.103097 2   0 M (1.00000000 0.00000000) *
##             123) texture_mean< 3.103097 27   0 B (0.00000000 1.00000000) *
##          31) symmetry_worst< -1.765259 73   7 B (0.09589041 0.90410959)  
##            62) symmetry_worst< -2.354921 4   0 M (1.00000000 0.00000000) *
##            63) symmetry_worst>=-2.354921 69   3 B (0.04347826 0.95652174)  
##             126) compactness_se>=-3.02233 1   0 M (1.00000000 0.00000000) *
##             127) compactness_se< -3.02233 68   2 B (0.02941176 0.97058824) *
## 
## $trees[[37]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 425 M (0.53399123 0.46600877)  
##     2) texture_worst>=4.50835 620 244 M (0.60645161 0.39354839)  
##       4) smoothness_worst< -1.429075 556 204 M (0.63309353 0.36690647)  
##         8) smoothness_mean>=-2.566967 546 194 M (0.64468864 0.35531136)  
##          16) symmetry_worst< -1.750953 273  76 M (0.72161172 0.27838828)  
##            32) symmetry_worst>=-1.862978 66   2 M (0.96969697 0.03030303)  
##              64) compactness_se< -3.586422 52   0 M (1.00000000 0.00000000) *
##              65) compactness_se>=-3.586422 14   2 M (0.85714286 0.14285714) *
##            33) symmetry_worst< -1.862978 207  74 M (0.64251208 0.35748792)  
##              66) symmetry_worst< -1.931815 173  48 M (0.72254335 0.27745665) *
##              67) symmetry_worst>=-1.931815 34   8 B (0.23529412 0.76470588) *
##          17) symmetry_worst>=-1.750953 273 118 M (0.56776557 0.43223443)  
##            34) symmetry_worst>=-1.724518 249  97 M (0.61044177 0.38955823)  
##              68) compactness_se< -3.859436 135  35 M (0.74074074 0.25925926) *
##              69) compactness_se>=-3.859436 114  52 B (0.45614035 0.54385965) *
##            35) symmetry_worst< -1.724518 24   3 B (0.12500000 0.87500000)  
##              70) texture_worst>=5.020647 3   0 M (1.00000000 0.00000000) *
##              71) texture_worst< 5.020647 21   0 B (0.00000000 1.00000000) *
##         9) smoothness_mean< -2.566967 10   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst>=-1.429075 64  24 B (0.37500000 0.62500000)  
##        10) symmetry_worst>=-1.529476 16   1 M (0.93750000 0.06250000)  
##          20) smoothness_mean>=-2.347148 15   0 M (1.00000000 0.00000000) *
##          21) smoothness_mean< -2.347148 1   0 B (0.00000000 1.00000000) *
##        11) symmetry_worst< -1.529476 48   9 B (0.18750000 0.81250000)  
##          22) texture_worst< 4.624204 7   1 M (0.85714286 0.14285714)  
##            44) texture_mean>=2.924481 6   0 M (1.00000000 0.00000000) *
##            45) texture_mean< 2.924481 1   0 B (0.00000000 1.00000000) *
##          23) texture_worst>=4.624204 41   3 B (0.07317073 0.92682927)  
##            46) texture_mean>=3.26885 1   0 M (1.00000000 0.00000000) *
##            47) texture_mean< 3.26885 40   2 B (0.05000000 0.95000000)  
##              94) texture_mean< 2.995481 12   2 B (0.16666667 0.83333333) *
##              95) texture_mean>=2.995481 28   0 B (0.00000000 1.00000000) *
##     3) texture_worst< 4.50835 292 111 B (0.38013699 0.61986301)  
##       6) smoothness_mean< -2.267218 182  85 B (0.46703297 0.53296703)  
##        12) smoothness_worst>=-1.496838 49  14 M (0.71428571 0.28571429)  
##          24) compactness_se>=-4.122059 40   5 M (0.87500000 0.12500000)  
##            48) texture_worst< 4.487999 38   3 M (0.92105263 0.07894737)  
##              96) symmetry_worst< -1.440359 35   1 M (0.97142857 0.02857143) *
##              97) symmetry_worst>=-1.440359 3   1 B (0.33333333 0.66666667) *
##            49) texture_worst>=4.487999 2   0 B (0.00000000 1.00000000) *
##          25) compactness_se< -4.122059 9   0 B (0.00000000 1.00000000) *
##        13) smoothness_worst< -1.496838 133  50 B (0.37593985 0.62406015)  
##          26) symmetry_worst< -1.698675 92  45 B (0.48913043 0.51086957)  
##            52) compactness_se>=-3.714667 34   7 M (0.79411765 0.20588235)  
##             104) smoothness_mean>=-2.456711 26   2 M (0.92307692 0.07692308) *
##             105) smoothness_mean< -2.456711 8   3 B (0.37500000 0.62500000) *
##            53) compactness_se< -3.714667 58  18 B (0.31034483 0.68965517)  
##             106) symmetry_worst>=-1.756915 5   0 M (1.00000000 0.00000000) *
##             107) symmetry_worst< -1.756915 53  13 B (0.24528302 0.75471698) *
##          27) symmetry_worst>=-1.698675 41   5 B (0.12195122 0.87804878)  
##            54) compactness_se< -4.281648 10   5 M (0.50000000 0.50000000)  
##             108) texture_mean>=2.975525 5   0 M (1.00000000 0.00000000) *
##             109) texture_mean< 2.975525 5   0 B (0.00000000 1.00000000) *
##            55) compactness_se>=-4.281648 31   0 B (0.00000000 1.00000000) *
##       7) smoothness_mean>=-2.267218 110  26 B (0.23636364 0.76363636)  
##        14) smoothness_mean>=-2.172878 34  15 M (0.55882353 0.44117647)  
##          28) smoothness_worst< -1.473124 11   0 M (1.00000000 0.00000000) *
##          29) smoothness_worst>=-1.473124 23   8 B (0.34782609 0.65217391)  
##            58) symmetry_worst>=-1.596878 10   2 M (0.80000000 0.20000000)  
##             116) smoothness_mean>=-2.087477 8   0 M (1.00000000 0.00000000) *
##             117) smoothness_mean< -2.087477 2   0 B (0.00000000 1.00000000) *
##            59) symmetry_worst< -1.596878 13   0 B (0.00000000 1.00000000) *
##        15) smoothness_mean< -2.172878 76   7 B (0.09210526 0.90789474)  
##          30) compactness_se>=-3.084108 6   1 M (0.83333333 0.16666667)  
##            60) texture_mean< 2.81718 5   0 M (1.00000000 0.00000000) *
##            61) texture_mean>=2.81718 1   0 B (0.00000000 1.00000000) *
##          31) compactness_se< -3.084108 70   2 B (0.02857143 0.97142857)  
##            62) smoothness_worst>=-1.429447 2   0 M (1.00000000 0.00000000) *
##            63) smoothness_worst< -1.429447 68   0 B (0.00000000 1.00000000) *
## 
## $trees[[38]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 449 B (0.49232456 0.50767544)  
##     2) symmetry_worst>=-1.815934 530 230 M (0.56603774 0.43396226)  
##       4) symmetry_worst< -1.809351 26   0 M (1.00000000 0.00000000) *
##       5) symmetry_worst>=-1.809351 504 230 M (0.54365079 0.45634921)  
##        10) compactness_se>=-4.547852 473 206 M (0.56448203 0.43551797)  
##          20) texture_mean>=2.929857 317 119 M (0.62460568 0.37539432)  
##            40) texture_worst< 4.786372 163  41 M (0.74846626 0.25153374)  
##              80) compactness_se>=-4.291103 157  35 M (0.77707006 0.22292994) *
##              81) compactness_se< -4.291103 6   0 B (0.00000000 1.00000000) *
##            41) texture_worst>=4.786372 154  76 B (0.49350649 0.50649351)  
##              82) texture_worst>=4.818867 130  55 M (0.57692308 0.42307692) *
##              83) texture_worst< 4.818867 24   1 B (0.04166667 0.95833333) *
##          21) texture_mean< 2.929857 156  69 B (0.44230769 0.55769231)  
##            42) texture_mean< 2.899771 123  55 M (0.55284553 0.44715447)  
##              84) compactness_se>=-3.982052 93  31 M (0.66666667 0.33333333) *
##              85) compactness_se< -3.982052 30   6 B (0.20000000 0.80000000) *
##            43) texture_mean>=2.899771 33   1 B (0.03030303 0.96969697)  
##              86) symmetry_worst>=-1.383772 1   0 M (1.00000000 0.00000000) *
##              87) symmetry_worst< -1.383772 32   0 B (0.00000000 1.00000000) *
##        11) compactness_se< -4.547852 31   7 B (0.22580645 0.77419355)  
##          22) texture_worst< 4.622562 11   4 M (0.63636364 0.36363636)  
##            44) texture_mean>=2.912851 7   0 M (1.00000000 0.00000000) *
##            45) texture_mean< 2.912851 4   0 B (0.00000000 1.00000000) *
##          23) texture_worst>=4.622562 20   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.815934 382 149 B (0.39005236 0.60994764)  
##       6) texture_worst>=4.897936 87  31 M (0.64367816 0.35632184)  
##        12) symmetry_worst>=-2.207988 69  14 M (0.79710145 0.20289855)  
##          24) texture_mean< 3.361554 63   8 M (0.87301587 0.12698413)  
##            48) compactness_se>=-4.758524 59   4 M (0.93220339 0.06779661)  
##              96) smoothness_worst>=-1.62752 57   2 M (0.96491228 0.03508772) *
##              97) smoothness_worst< -1.62752 2   0 B (0.00000000 1.00000000) *
##            49) compactness_se< -4.758524 4   0 B (0.00000000 1.00000000) *
##          25) texture_mean>=3.361554 6   0 B (0.00000000 1.00000000) *
##        13) symmetry_worst< -2.207988 18   1 B (0.05555556 0.94444444)  
##          26) smoothness_mean>=-2.282229 1   0 M (1.00000000 0.00000000) *
##          27) smoothness_mean< -2.282229 17   0 B (0.00000000 1.00000000) *
##       7) texture_worst< 4.897936 295  93 B (0.31525424 0.68474576)  
##        14) smoothness_worst< -1.474648 228  90 B (0.39473684 0.60526316)  
##          28) smoothness_worst>=-1.482898 24   3 M (0.87500000 0.12500000)  
##            56) compactness_se>=-3.967101 22   1 M (0.95454545 0.04545455)  
##             112) texture_mean>=2.721909 21   0 M (1.00000000 0.00000000) *
##             113) texture_mean< 2.721909 1   0 B (0.00000000 1.00000000) *
##            57) compactness_se< -3.967101 2   0 B (0.00000000 1.00000000) *
##          29) smoothness_worst< -1.482898 204  69 B (0.33823529 0.66176471)  
##            58) texture_worst>=4.575448 70  34 M (0.51428571 0.48571429)  
##             116) smoothness_worst>=-1.580846 51  15 M (0.70588235 0.29411765) *
##             117) smoothness_worst< -1.580846 19   0 B (0.00000000 1.00000000) *
##            59) texture_worst< 4.575448 134  33 B (0.24626866 0.75373134)  
##             118) smoothness_worst< -1.595961 47  22 M (0.53191489 0.46808511) *
##             119) smoothness_worst>=-1.595961 87   8 B (0.09195402 0.90804598) *
##        15) smoothness_worst>=-1.474648 67   3 B (0.04477612 0.95522388)  
##          30) smoothness_mean< -2.352223 8   3 B (0.37500000 0.62500000)  
##            60) texture_mean>=2.830895 3   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 2.830895 5   0 B (0.00000000 1.00000000) *
##          31) smoothness_mean>=-2.352223 59   0 B (0.00000000 1.00000000) *
## 
## $trees[[39]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 380 B (0.41666667 0.58333333)  
##     2) smoothness_mean>=-2.423454 691 318 B (0.46020260 0.53979740)  
##       4) texture_worst>=4.389172 505 243 M (0.51881188 0.48118812)  
##         8) texture_mean< 3.35917 484 222 M (0.54132231 0.45867769)  
##          16) smoothness_mean< -2.093138 465 203 M (0.56344086 0.43655914)  
##            32) symmetry_worst>=-2.052205 355 134 M (0.62253521 0.37746479)  
##              64) texture_mean>=3.054236 107  22 M (0.79439252 0.20560748) *
##              65) texture_mean< 3.054236 248 112 M (0.54838710 0.45161290) *
##            33) symmetry_worst< -2.052205 110  41 B (0.37272727 0.62727273)  
##              66) symmetry_worst< -2.115205 77  36 M (0.53246753 0.46753247) *
##              67) symmetry_worst>=-2.115205 33   0 B (0.00000000 1.00000000) *
##          17) smoothness_mean>=-2.093138 19   0 B (0.00000000 1.00000000) *
##         9) texture_mean>=3.35917 21   0 B (0.00000000 1.00000000) *
##       5) texture_worst< 4.389172 186  56 B (0.30107527 0.69892473)  
##        10) texture_worst< 4.313991 135  50 B (0.37037037 0.62962963)  
##          20) symmetry_worst>=-1.828847 77  38 M (0.50649351 0.49350649)  
##            40) compactness_se>=-3.647113 36   9 M (0.75000000 0.25000000)  
##              80) texture_mean< 2.801532 25   2 M (0.92000000 0.08000000) *
##              81) texture_mean>=2.801532 11   4 B (0.36363636 0.63636364) *
##            41) compactness_se< -3.647113 41  12 B (0.29268293 0.70731707)  
##              82) smoothness_mean< -2.411844 9   1 M (0.88888889 0.11111111) *
##              83) smoothness_mean>=-2.411844 32   4 B (0.12500000 0.87500000) *
##          21) symmetry_worst< -1.828847 58  11 B (0.18965517 0.81034483)  
##            42) compactness_se< -3.57366 23  11 B (0.47826087 0.52173913)  
##              84) smoothness_worst>=-1.499666 10   1 M (0.90000000 0.10000000) *
##              85) smoothness_worst< -1.499666 13   2 B (0.15384615 0.84615385) *
##            43) compactness_se>=-3.57366 35   0 B (0.00000000 1.00000000) *
##        11) texture_worst>=4.313991 51   6 B (0.11764706 0.88235294)  
##          22) smoothness_worst>=-1.428351 2   0 M (1.00000000 0.00000000) *
##          23) smoothness_worst< -1.428351 49   4 B (0.08163265 0.91836735)  
##            46) compactness_se>=-3.095053 8   3 B (0.37500000 0.62500000)  
##              92) texture_mean>=2.911683 3   0 M (1.00000000 0.00000000) *
##              93) texture_mean< 2.911683 5   0 B (0.00000000 1.00000000) *
##            47) compactness_se< -3.095053 41   1 B (0.02439024 0.97560976)  
##              94) symmetry_worst< -1.973981 8   1 B (0.12500000 0.87500000) *
##              95) symmetry_worst>=-1.973981 33   0 B (0.00000000 1.00000000) *
##     3) smoothness_mean< -2.423454 221  62 B (0.28054299 0.71945701)  
##       6) symmetry_worst>=-1.496954 15   3 M (0.80000000 0.20000000)  
##        12) texture_mean>=2.977947 12   0 M (1.00000000 0.00000000) *
##        13) texture_mean< 2.977947 3   0 B (0.00000000 1.00000000) *
##       7) symmetry_worst< -1.496954 206  50 B (0.24271845 0.75728155)  
##        14) texture_mean>=3.388429 7   0 M (1.00000000 0.00000000) *
##        15) texture_mean< 3.388429 199  43 B (0.21608040 0.78391960)  
##          30) compactness_se< -4.687525 23   9 M (0.60869565 0.39130435)  
##            60) compactness_se>=-4.706178 14   0 M (1.00000000 0.00000000) *
##            61) compactness_se< -4.706178 9   0 B (0.00000000 1.00000000) *
##          31) compactness_se>=-4.687525 176  29 B (0.16477273 0.83522727)  
##            62) texture_worst< 3.96146 12   3 M (0.75000000 0.25000000)  
##             124) texture_mean>=2.754513 9   0 M (1.00000000 0.00000000) *
##             125) texture_mean< 2.754513 3   0 B (0.00000000 1.00000000) *
##            63) texture_worst>=3.96146 164  20 B (0.12195122 0.87804878)  
##             126) smoothness_worst< -1.720903 4   1 M (0.75000000 0.25000000) *
##             127) smoothness_worst>=-1.720903 160  17 B (0.10625000 0.89375000) *
## 
## $trees[[40]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 348 B (0.38157895 0.61842105)  
##     2) symmetry_worst>=-1.366937 40  10 M (0.75000000 0.25000000)  
##       4) symmetry_worst< -1.23578 22   0 M (1.00000000 0.00000000) *
##       5) symmetry_worst>=-1.23578 18   8 B (0.44444444 0.55555556)  
##        10) smoothness_worst>=-1.451731 7   0 M (1.00000000 0.00000000) *
##        11) smoothness_worst< -1.451731 11   1 B (0.09090909 0.90909091)  
##          22) texture_mean>=3.158816 1   0 M (1.00000000 0.00000000) *
##          23) texture_mean< 3.158816 10   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.366937 872 318 B (0.36467890 0.63532110)  
##       6) smoothness_worst< -1.439482 757 296 B (0.39101717 0.60898283)  
##        12) symmetry_worst>=-2.232873 679 283 B (0.41678940 0.58321060)  
##          24) symmetry_worst< -1.634569 467 218 B (0.46680942 0.53319058)  
##            48) smoothness_worst>=-1.604472 410 200 M (0.51219512 0.48780488)  
##              96) smoothness_mean< -2.349846 218  86 M (0.60550459 0.39449541) *
##              97) smoothness_mean>=-2.349846 192  78 B (0.40625000 0.59375000) *
##            49) smoothness_worst< -1.604472 57   8 B (0.14035088 0.85964912)  
##              98) texture_mean< 2.966301 13   5 B (0.38461538 0.61538462) *
##              99) texture_mean>=2.966301 44   3 B (0.06818182 0.93181818) *
##          25) symmetry_worst>=-1.634569 212  65 B (0.30660377 0.69339623)  
##            50) symmetry_worst>=-1.549706 109  48 B (0.44036697 0.55963303)  
##             100) smoothness_worst< -1.513087 42  13 M (0.69047619 0.30952381) *
##             101) smoothness_worst>=-1.513087 67  19 B (0.28358209 0.71641791) *
##            51) symmetry_worst< -1.549706 103  17 B (0.16504854 0.83495146)  
##             102) smoothness_worst>=-1.470752 10   2 M (0.80000000 0.20000000) *
##             103) smoothness_worst< -1.470752 93   9 B (0.09677419 0.90322581) *
##        13) symmetry_worst< -2.232873 78  13 B (0.16666667 0.83333333)  
##          26) smoothness_mean>=-2.307549 28  13 B (0.46428571 0.53571429)  
##            52) smoothness_worst< -1.59459 11   0 M (1.00000000 0.00000000) *
##            53) smoothness_worst>=-1.59459 17   2 B (0.11764706 0.88235294)  
##             106) texture_worst>=4.618547 2   0 M (1.00000000 0.00000000) *
##             107) texture_worst< 4.618547 15   0 B (0.00000000 1.00000000) *
##          27) smoothness_mean< -2.307549 50   0 B (0.00000000 1.00000000) *
##       7) smoothness_worst>=-1.439482 115  22 B (0.19130435 0.80869565)  
##        14) smoothness_mean>=-2.079457 7   1 M (0.85714286 0.14285714)  
##          28) smoothness_mean< -1.889548 6   0 M (1.00000000 0.00000000) *
##          29) smoothness_mean>=-1.889548 1   0 B (0.00000000 1.00000000) *
##        15) smoothness_mean< -2.079457 108  16 B (0.14814815 0.85185185)  
##          30) texture_mean>=3.242184 4   0 M (1.00000000 0.00000000) *
##          31) texture_mean< 3.242184 104  12 B (0.11538462 0.88461538)  
##            62) compactness_se< -4.38342 8   3 M (0.62500000 0.37500000)  
##             124) texture_mean>=2.904788 5   0 M (1.00000000 0.00000000) *
##             125) texture_mean< 2.904788 3   0 B (0.00000000 1.00000000) *
##            63) compactness_se>=-4.38342 96   7 B (0.07291667 0.92708333)  
##             126) smoothness_worst>=-1.334845 1   0 M (1.00000000 0.00000000) *
##             127) smoothness_worst< -1.334845 95   6 B (0.06315789 0.93684211) *
## 
## $trees[[41]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 368 B (0.40350877 0.59649123)  
##     2) smoothness_worst< -1.434076 810 344 B (0.42469136 0.57530864)  
##       4) smoothness_worst>=-1.482699 221 102 M (0.53846154 0.46153846)  
##         8) symmetry_worst>=-1.721298 118  33 M (0.72033898 0.27966102)  
##          16) symmetry_worst< -1.244631 110  25 M (0.77272727 0.22727273)  
##            32) compactness_se>=-4.512898 104  19 M (0.81730769 0.18269231)  
##              64) compactness_se>=-3.465064 28   0 M (1.00000000 0.00000000) *
##              65) compactness_se< -3.465064 76  19 M (0.75000000 0.25000000) *
##            33) compactness_se< -4.512898 6   0 B (0.00000000 1.00000000) *
##          17) symmetry_worst>=-1.244631 8   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst< -1.721298 103  34 B (0.33009709 0.66990291)  
##          18) smoothness_worst< -1.480334 14   0 M (1.00000000 0.00000000) *
##          19) smoothness_worst>=-1.480334 89  20 B (0.22471910 0.77528090)  
##            38) texture_worst>=4.877645 19   4 M (0.78947368 0.21052632)  
##              76) compactness_se>=-4.054503 15   0 M (1.00000000 0.00000000) *
##              77) compactness_se< -4.054503 4   0 B (0.00000000 1.00000000) *
##            39) texture_worst< 4.877645 70   5 B (0.07142857 0.92857143)  
##              78) smoothness_worst< -1.477193 5   2 B (0.40000000 0.60000000) *
##              79) smoothness_worst>=-1.477193 65   3 B (0.04615385 0.95384615) *
##       5) smoothness_worst< -1.482699 589 225 B (0.38200340 0.61799660)  
##        10) compactness_se< -4.49319 77  31 M (0.59740260 0.40259740)  
##          20) compactness_se>=-4.779408 65  19 M (0.70769231 0.29230769)  
##            40) texture_worst>=4.712681 30   1 M (0.96666667 0.03333333)  
##              80) texture_mean< 3.262924 29   0 M (1.00000000 0.00000000) *
##              81) texture_mean>=3.262924 1   0 B (0.00000000 1.00000000) *
##            41) texture_worst< 4.712681 35  17 B (0.48571429 0.51428571)  
##              82) smoothness_mean>=-2.422683 17   3 M (0.82352941 0.17647059) *
##              83) smoothness_mean< -2.422683 18   3 B (0.16666667 0.83333333) *
##          21) compactness_se< -4.779408 12   0 B (0.00000000 1.00000000) *
##        11) compactness_se>=-4.49319 512 179 B (0.34960938 0.65039062)  
##          22) compactness_se>=-3.721197 227 102 B (0.44933921 0.55066079)  
##            44) compactness_se< -3.657776 18   1 M (0.94444444 0.05555556)  
##              88) texture_worst< 4.828083 15   0 M (1.00000000 0.00000000) *
##              89) texture_worst>=4.828083 3   1 M (0.66666667 0.33333333) *
##            45) compactness_se>=-3.657776 209  85 B (0.40669856 0.59330144)  
##              90) texture_mean>=3.035431 87  36 M (0.58620690 0.41379310) *
##              91) texture_mean< 3.035431 122  34 B (0.27868852 0.72131148) *
##          23) compactness_se< -3.721197 285  77 B (0.27017544 0.72982456)  
##            46) compactness_se< -3.869459 226  77 B (0.34070796 0.65929204)  
##              92) texture_worst>=5.001873 32   8 M (0.75000000 0.25000000) *
##              93) texture_worst< 5.001873 194  53 B (0.27319588 0.72680412) *
##            47) compactness_se>=-3.869459 59   0 B (0.00000000 1.00000000) *
##     3) smoothness_worst>=-1.434076 102  24 B (0.23529412 0.76470588)  
##       6) symmetry_worst>=-1.270655 5   0 M (1.00000000 0.00000000) *
##       7) symmetry_worst< -1.270655 97  19 B (0.19587629 0.80412371)  
##        14) compactness_se>=-2.695649 2   0 M (1.00000000 0.00000000) *
##        15) compactness_se< -2.695649 95  17 B (0.17894737 0.82105263)  
##          30) compactness_se< -4.186419 5   2 M (0.60000000 0.40000000)  
##            60) texture_mean>=2.950291 3   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 2.950291 2   0 B (0.00000000 1.00000000) *
##          31) compactness_se>=-4.186419 90  14 B (0.15555556 0.84444444)  
##            62) compactness_se>=-3.998097 59  14 B (0.23728814 0.76271186)  
##             124) compactness_se< -3.768789 7   1 M (0.85714286 0.14285714) *
##             125) compactness_se>=-3.768789 52   8 B (0.15384615 0.84615385) *
##            63) compactness_se< -3.998097 31   0 B (0.00000000 1.00000000) *
## 
## $trees[[42]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 409 B (0.44846491 0.55153509)  
##     2) texture_worst< 4.067485 63  18 M (0.71428571 0.28571429)  
##       4) texture_mean>=2.650752 45   6 M (0.86666667 0.13333333)  
##         8) compactness_se< -3.48221 39   2 M (0.94871795 0.05128205)  
##          16) smoothness_mean>=-2.466148 38   1 M (0.97368421 0.02631579)  
##            32) compactness_se>=-4.044115 37   0 M (1.00000000 0.00000000) *
##            33) compactness_se< -4.044115 1   0 B (0.00000000 1.00000000) *
##          17) smoothness_mean< -2.466148 1   0 B (0.00000000 1.00000000) *
##         9) compactness_se>=-3.48221 6   2 B (0.33333333 0.66666667)  
##          18) texture_mean< 2.690691 2   0 M (1.00000000 0.00000000) *
##          19) texture_mean>=2.690691 4   0 B (0.00000000 1.00000000) *
##       5) texture_mean< 2.650752 18   6 B (0.33333333 0.66666667)  
##        10) texture_mean< 2.525679 6   0 M (1.00000000 0.00000000) *
##        11) texture_mean>=2.525679 12   0 B (0.00000000 1.00000000) *
##     3) texture_worst>=4.067485 849 364 B (0.42873969 0.57126031)  
##       6) symmetry_worst>=-2.01934 707 327 B (0.46251768 0.53748232)  
##        12) symmetry_worst< -1.990832 26   1 M (0.96153846 0.03846154)  
##          24) smoothness_mean< -2.261392 22   0 M (1.00000000 0.00000000) *
##          25) smoothness_mean>=-2.261392 4   1 M (0.75000000 0.25000000)  
##            50) texture_mean>=2.960755 3   0 M (1.00000000 0.00000000) *
##            51) texture_mean< 2.960755 1   0 B (0.00000000 1.00000000) *
##        13) symmetry_worst>=-1.990832 681 302 B (0.44346549 0.55653451)  
##          26) texture_mean>=2.876103 534 261 B (0.48876404 0.51123596)  
##            52) texture_mean< 2.893423 33   2 M (0.93939394 0.06060606)  
##             104) symmetry_worst< -1.684827 30   0 M (1.00000000 0.00000000) *
##             105) symmetry_worst>=-1.684827 3   1 B (0.33333333 0.66666667) *
##            53) texture_mean>=2.893423 501 230 B (0.45908184 0.54091816)  
##             106) smoothness_mean< -2.114071 473 229 B (0.48414376 0.51585624) *
##             107) smoothness_mean>=-2.114071 28   1 B (0.03571429 0.96428571) *
##          27) texture_mean< 2.876103 147  41 B (0.27891156 0.72108844)  
##            54) smoothness_mean>=-2.15202 11   1 M (0.90909091 0.09090909)  
##             108) compactness_se>=-3.643968 10   0 M (1.00000000 0.00000000) *
##             109) compactness_se< -3.643968 1   0 B (0.00000000 1.00000000) *
##            55) smoothness_mean< -2.15202 136  31 B (0.22794118 0.77205882)  
##             110) texture_mean< 2.841997 86  29 B (0.33720930 0.66279070) *
##             111) texture_mean>=2.841997 50   2 B (0.04000000 0.96000000) *
##       7) symmetry_worst< -2.01934 142  37 B (0.26056338 0.73943662)  
##        14) smoothness_mean>=-2.394379 83  30 B (0.36144578 0.63855422)  
##          28) smoothness_mean< -2.382983 13   0 M (1.00000000 0.00000000) *
##          29) smoothness_mean>=-2.382983 70  17 B (0.24285714 0.75714286)  
##            58) compactness_se< -4.492707 5   0 M (1.00000000 0.00000000) *
##            59) compactness_se>=-4.492707 65  12 B (0.18461538 0.81538462)  
##             118) smoothness_worst>=-1.518057 31  12 B (0.38709677 0.61290323) *
##             119) smoothness_worst< -1.518057 34   0 B (0.00000000 1.00000000) *
##        15) smoothness_mean< -2.394379 59   7 B (0.11864407 0.88135593)  
##          30) compactness_se>=-3.530926 10   3 M (0.70000000 0.30000000)  
##            60) compactness_se< -3.188171 6   0 M (1.00000000 0.00000000) *
##            61) compactness_se>=-3.188171 4   1 B (0.25000000 0.75000000)  
##             122) texture_worst< 4.59095 1   0 M (1.00000000 0.00000000) *
##             123) texture_worst>=4.59095 3   0 B (0.00000000 1.00000000) *
##          31) compactness_se< -3.530926 49   0 B (0.00000000 1.00000000) *
## 
## $trees[[43]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 424 M (0.53508772 0.46491228)  
##     2) symmetry_worst>=-2.052205 766 326 M (0.57441253 0.42558747)  
##       4) smoothness_worst< -1.363268 747 310 M (0.58500669 0.41499331)  
##         8) compactness_se>=-3.721197 323 108 M (0.66563467 0.33436533)  
##          16) smoothness_mean>=-2.325189 173  39 M (0.77456647 0.22543353)  
##            32) compactness_se< -3.494301 68   4 M (0.94117647 0.05882353)  
##              64) texture_mean>=2.649195 65   1 M (0.98461538 0.01538462) *
##              65) texture_mean< 2.649195 3   0 B (0.00000000 1.00000000) *
##            33) compactness_se>=-3.494301 105  35 M (0.66666667 0.33333333)  
##              66) compactness_se>=-3.447524 93  23 M (0.75268817 0.24731183) *
##              67) compactness_se< -3.447524 12   0 B (0.00000000 1.00000000) *
##          17) smoothness_mean< -2.325189 150  69 M (0.54000000 0.46000000)  
##            34) smoothness_mean< -2.326622 135  54 M (0.60000000 0.40000000)  
##              68) smoothness_worst>=-1.50451 32   3 M (0.90625000 0.09375000) *
##              69) smoothness_worst< -1.50451 103  51 M (0.50485437 0.49514563) *
##            35) smoothness_mean>=-2.326622 15   0 B (0.00000000 1.00000000) *
##         9) compactness_se< -3.721197 424 202 M (0.52358491 0.47641509)  
##          18) compactness_se< -3.734107 402 180 M (0.55223881 0.44776119)  
##            36) smoothness_worst< -1.429075 368 154 M (0.58152174 0.41847826)  
##              72) smoothness_worst>=-1.446808 35   3 M (0.91428571 0.08571429) *
##              73) smoothness_worst< -1.446808 333 151 M (0.54654655 0.45345345) *
##            37) smoothness_worst>=-1.429075 34   8 B (0.23529412 0.76470588)  
##              74) smoothness_mean>=-2.15812 3   0 M (1.00000000 0.00000000) *
##              75) smoothness_mean< -2.15812 31   5 B (0.16129032 0.83870968) *
##          19) compactness_se>=-3.734107 22   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst>=-1.363268 19   3 B (0.15789474 0.84210526)  
##        10) compactness_se< -3.488238 5   2 M (0.60000000 0.40000000)  
##          20) texture_mean>=2.688296 3   0 M (1.00000000 0.00000000) *
##          21) texture_mean< 2.688296 2   0 B (0.00000000 1.00000000) *
##        11) compactness_se>=-3.488238 14   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -2.052205 146  48 B (0.32876712 0.67123288)  
##       6) texture_worst< 5.15236 119  48 B (0.40336134 0.59663866)  
##        12) symmetry_worst< -2.111279 97  47 B (0.48453608 0.51546392)  
##          24) symmetry_worst>=-2.191305 32   8 M (0.75000000 0.25000000)  
##            48) smoothness_mean>=-2.408231 25   1 M (0.96000000 0.04000000)  
##              96) compactness_se< -3.495845 23   0 M (1.00000000 0.00000000) *
##              97) compactness_se>=-3.495845 2   1 M (0.50000000 0.50000000) *
##            49) smoothness_mean< -2.408231 7   0 B (0.00000000 1.00000000) *
##          25) symmetry_worst< -2.191305 65  23 B (0.35384615 0.64615385)  
##            50) smoothness_mean>=-2.225534 6   0 M (1.00000000 0.00000000) *
##            51) smoothness_mean< -2.225534 59  17 B (0.28813559 0.71186441)  
##             102) compactness_se< -4.564659 8   1 M (0.87500000 0.12500000) *
##             103) compactness_se>=-4.564659 51  10 B (0.19607843 0.80392157) *
##        13) symmetry_worst>=-2.111279 22   1 B (0.04545455 0.95454545)  
##          26) smoothness_mean< -2.576965 1   0 M (1.00000000 0.00000000) *
##          27) smoothness_mean>=-2.576965 21   0 B (0.00000000 1.00000000) *
##       7) texture_worst>=5.15236 27   0 B (0.00000000 1.00000000) *
## 
## $trees[[44]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 431 B (0.47258772 0.52741228)  
##     2) symmetry_worst>=-1.36527 34   3 M (0.91176471 0.08823529)  
##       4) smoothness_worst>=-1.49848 25   0 M (1.00000000 0.00000000) *
##       5) smoothness_worst< -1.49848 9   3 M (0.66666667 0.33333333)  
##        10) smoothness_mean< -2.372457 6   0 M (1.00000000 0.00000000) *
##        11) smoothness_mean>=-2.372457 3   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.36527 878 400 B (0.45558087 0.54441913)  
##       6) compactness_se< -3.355844 760 366 B (0.48157895 0.51842105)  
##        12) texture_worst< 4.740988 495 230 M (0.53535354 0.46464646)  
##          24) texture_worst>=4.737862 26   0 M (1.00000000 0.00000000) *
##          25) texture_worst< 4.737862 469 230 M (0.50959488 0.49040512)  
##            50) compactness_se>=-3.705619 178  62 M (0.65168539 0.34831461)  
##             100) smoothness_worst< -1.477785 136  30 M (0.77941176 0.22058824) *
##             101) smoothness_worst>=-1.477785 42  10 B (0.23809524 0.76190476) *
##            51) compactness_se< -3.705619 291 123 B (0.42268041 0.57731959)  
##             102) smoothness_worst>=-1.451541 42   9 M (0.78571429 0.21428571) *
##             103) smoothness_worst< -1.451541 249  90 B (0.36144578 0.63855422) *
##        13) texture_worst>=4.740988 265 101 B (0.38113208 0.61886792)  
##          26) smoothness_worst>=-1.52112 122  52 M (0.57377049 0.42622951)  
##            52) texture_worst>=4.821213 101  33 M (0.67326733 0.32673267)  
##             104) texture_worst< 5.163886 71  12 M (0.83098592 0.16901408) *
##             105) texture_worst>=5.163886 30   9 B (0.30000000 0.70000000) *
##            53) texture_worst< 4.821213 21   2 B (0.09523810 0.90476190)  
##             106) smoothness_mean>=-2.208092 2   0 M (1.00000000 0.00000000) *
##             107) smoothness_mean< -2.208092 19   0 B (0.00000000 1.00000000) *
##          27) smoothness_worst< -1.52112 143  31 B (0.21678322 0.78321678)  
##            54) compactness_se< -4.49816 25  10 M (0.60000000 0.40000000)  
##             108) compactness_se>=-4.706178 16   1 M (0.93750000 0.06250000) *
##             109) compactness_se< -4.706178 9   0 B (0.00000000 1.00000000) *
##            55) compactness_se>=-4.49816 118  16 B (0.13559322 0.86440678)  
##             110) smoothness_mean>=-2.369574 34  14 B (0.41176471 0.58823529) *
##             111) smoothness_mean< -2.369574 84   2 B (0.02380952 0.97619048) *
##       7) compactness_se>=-3.355844 118  34 B (0.28813559 0.71186441)  
##        14) texture_mean>=3.039251 38  11 M (0.71052632 0.28947368)  
##          28) texture_worst< 4.980233 21   1 M (0.95238095 0.04761905)  
##            56) smoothness_mean>=-2.479915 19   0 M (1.00000000 0.00000000) *
##            57) smoothness_mean< -2.479915 2   1 M (0.50000000 0.50000000)  
##             114) texture_mean>=3.072866 1   0 M (1.00000000 0.00000000) *
##             115) texture_mean< 3.072866 1   0 B (0.00000000 1.00000000) *
##          29) texture_worst>=4.980233 17   7 B (0.41176471 0.58823529)  
##            58) texture_worst>=5.016194 6   0 M (1.00000000 0.00000000) *
##            59) texture_worst< 5.016194 11   1 B (0.09090909 0.90909091)  
##             118) texture_mean< 3.154646 1   0 M (1.00000000 0.00000000) *
##             119) texture_mean>=3.154646 10   0 B (0.00000000 1.00000000) *
##        15) texture_mean< 3.039251 80   7 B (0.08750000 0.91250000)  
##          30) smoothness_worst>=-1.387398 7   2 M (0.71428571 0.28571429)  
##            60) texture_mean>=2.701935 5   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 2.701935 2   0 B (0.00000000 1.00000000) *
##          31) smoothness_worst< -1.387398 73   2 B (0.02739726 0.97260274)  
##            62) texture_mean>=3.031099 5   1 B (0.20000000 0.80000000)  
##             124) texture_mean< 3.032546 1   0 M (1.00000000 0.00000000) *
##             125) texture_mean>=3.032546 4   0 B (0.00000000 1.00000000) *
##            63) texture_mean< 3.031099 68   1 B (0.01470588 0.98529412)  
##             126) symmetry_worst>=-1.474719 9   1 B (0.11111111 0.88888889) *
##             127) symmetry_worst< -1.474719 59   0 B (0.00000000 1.00000000) *
## 
## $trees[[45]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 442 M (0.51535088 0.48464912)  
##     2) texture_worst< 4.642157 551 235 M (0.57350272 0.42649728)  
##       4) texture_mean>=2.96604 155  36 M (0.76774194 0.23225806)  
##         8) texture_worst>=4.354728 150  31 M (0.79333333 0.20666667)  
##          16) compactness_se< -2.807696 141  25 M (0.82269504 0.17730496)  
##            32) compactness_se>=-4.291103 127  18 M (0.85826772 0.14173228)  
##              64) texture_mean< 3.138519 124  15 M (0.87903226 0.12096774) *
##              65) texture_mean>=3.138519 3   0 B (0.00000000 1.00000000) *
##            33) compactness_se< -4.291103 14   7 M (0.50000000 0.50000000)  
##              66) symmetry_worst< -1.7426 9   2 M (0.77777778 0.22222222) *
##              67) symmetry_worst>=-1.7426 5   0 B (0.00000000 1.00000000) *
##          17) compactness_se>=-2.807696 9   3 B (0.33333333 0.66666667)  
##            34) texture_mean>=3.003947 3   0 M (1.00000000 0.00000000) *
##            35) texture_mean< 3.003947 6   0 B (0.00000000 1.00000000) *
##         9) texture_worst< 4.354728 5   0 B (0.00000000 1.00000000) *
##       5) texture_mean< 2.96604 396 197 B (0.49747475 0.50252525)  
##        10) smoothness_worst< -1.470556 309 139 M (0.55016181 0.44983819)  
##          20) texture_mean< 2.940483 268 106 M (0.60447761 0.39552239)  
##            40) compactness_se>=-4.701576 257  95 M (0.63035019 0.36964981)  
##              80) smoothness_mean>=-2.50355 248  86 M (0.65322581 0.34677419) *
##              81) smoothness_mean< -2.50355 9   0 B (0.00000000 1.00000000) *
##            41) compactness_se< -4.701576 11   0 B (0.00000000 1.00000000) *
##          21) texture_mean>=2.940483 41   8 B (0.19512195 0.80487805)  
##            42) smoothness_worst>=-1.506664 11   3 M (0.72727273 0.27272727)  
##              84) texture_mean>=2.955938 8   0 M (1.00000000 0.00000000) *
##              85) texture_mean< 2.955938 3   0 B (0.00000000 1.00000000) *
##            43) smoothness_worst< -1.506664 30   0 B (0.00000000 1.00000000) *
##        11) smoothness_worst>=-1.470556 87  27 B (0.31034483 0.68965517)  
##          22) smoothness_worst>=-1.448697 51  25 M (0.50980392 0.49019608)  
##            44) smoothness_worst< -1.440419 18   2 M (0.88888889 0.11111111)  
##              88) texture_mean>=2.801549 16   0 M (1.00000000 0.00000000) *
##              89) texture_mean< 2.801549 2   0 B (0.00000000 1.00000000) *
##            45) smoothness_worst>=-1.440419 33  10 B (0.30303030 0.69696970)  
##              90) compactness_se>=-3.300819 5   0 M (1.00000000 0.00000000) *
##              91) compactness_se< -3.300819 28   5 B (0.17857143 0.82142857) *
##          23) smoothness_worst< -1.448697 36   1 B (0.02777778 0.97222222)  
##            46) texture_worst< 4.017902 2   1 M (0.50000000 0.50000000)  
##              92) texture_mean>=2.587173 1   0 M (1.00000000 0.00000000) *
##              93) texture_mean< 2.587173 1   0 B (0.00000000 1.00000000) *
##            47) texture_worst>=4.017902 34   0 B (0.00000000 1.00000000) *
##     3) texture_worst>=4.642157 361 154 B (0.42659280 0.57340720)  
##       6) symmetry_worst>=-1.551105 74  24 M (0.67567568 0.32432432)  
##        12) compactness_se>=-4.507761 68  18 M (0.73529412 0.26470588)  
##          24) compactness_se< -3.135699 58  11 M (0.81034483 0.18965517)  
##            48) texture_worst>=4.82155 27   0 M (1.00000000 0.00000000) *
##            49) texture_worst< 4.82155 31  11 M (0.64516129 0.35483871)  
##              98) texture_worst< 4.789782 20   0 M (1.00000000 0.00000000) *
##              99) texture_worst>=4.789782 11   0 B (0.00000000 1.00000000) *
##          25) compactness_se>=-3.135699 10   3 B (0.30000000 0.70000000)  
##            50) smoothness_mean>=-2.336585 3   0 M (1.00000000 0.00000000) *
##            51) smoothness_mean< -2.336585 7   0 B (0.00000000 1.00000000) *
##        13) compactness_se< -4.507761 6   0 B (0.00000000 1.00000000) *
##       7) symmetry_worst< -1.551105 287 104 B (0.36236934 0.63763066)  
##        14) compactness_se< -4.434687 31   8 M (0.74193548 0.25806452)  
##          28) symmetry_worst>=-1.912217 23   2 M (0.91304348 0.08695652)  
##            56) texture_mean>=2.915217 21   0 M (1.00000000 0.00000000) *
##            57) texture_mean< 2.915217 2   0 B (0.00000000 1.00000000) *
##          29) symmetry_worst< -1.912217 8   2 B (0.25000000 0.75000000)  
##            58) smoothness_worst>=-1.555243 3   1 M (0.66666667 0.33333333)  
##             116) texture_mean< 3.213341 2   0 M (1.00000000 0.00000000) *
##             117) texture_mean>=3.213341 1   0 B (0.00000000 1.00000000) *
##            59) smoothness_worst< -1.555243 5   0 B (0.00000000 1.00000000) *
##        15) compactness_se>=-4.434687 256  81 B (0.31640625 0.68359375)  
##          30) smoothness_mean>=-2.394379 176  69 B (0.39204545 0.60795455)  
##            60) smoothness_worst< -1.484675 83  34 M (0.59036145 0.40963855)  
##             120) symmetry_worst>=-2.207988 67  18 M (0.73134328 0.26865672) *
##             121) symmetry_worst< -2.207988 16   0 B (0.00000000 1.00000000) *
##            61) smoothness_worst>=-1.484675 93  20 B (0.21505376 0.78494624)  
##             122) compactness_se>=-3.300427 7   0 M (1.00000000 0.00000000) *
##             123) compactness_se< -3.300427 86  13 B (0.15116279 0.84883721) *
##          31) smoothness_mean< -2.394379 80  12 B (0.15000000 0.85000000)  
##            62) smoothness_worst>=-1.522574 20   9 B (0.45000000 0.55000000)  
##             124) smoothness_mean< -2.439503 11   2 M (0.81818182 0.18181818) *
##             125) smoothness_mean>=-2.439503 9   0 B (0.00000000 1.00000000) *
##            63) smoothness_worst< -1.522574 60   3 B (0.05000000 0.95000000)  
##             126) symmetry_worst< -2.242858 1   0 M (1.00000000 0.00000000) *
##             127) symmetry_worst>=-2.242858 59   2 B (0.03389831 0.96610169) *
## 
## $trees[[46]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 425 B (0.46600877 0.53399123)  
##     2) smoothness_mean>=-2.367658 500 235 M (0.53000000 0.47000000)  
##       4) smoothness_mean< -2.349943 56   8 M (0.85714286 0.14285714)  
##         8) symmetry_worst< -1.528454 49   1 M (0.97959184 0.02040816)  
##          16) compactness_se>=-4.721249 48   0 M (1.00000000 0.00000000) *
##          17) compactness_se< -4.721249 1   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst>=-1.528454 7   0 B (0.00000000 1.00000000) *
##       5) smoothness_mean>=-2.349943 444 217 B (0.48873874 0.51126126)  
##        10) compactness_se>=-4.025757 329 141 M (0.57142857 0.42857143)  
##          20) compactness_se< -3.494301 211  65 M (0.69194313 0.30805687)  
##            40) texture_worst>=4.907333 47   2 M (0.95744681 0.04255319)  
##              80) symmetry_worst>=-2.207988 45   0 M (1.00000000 0.00000000) *
##              81) symmetry_worst< -2.207988 2   0 B (0.00000000 1.00000000) *
##            41) texture_worst< 4.907333 164  63 M (0.61585366 0.38414634)  
##              82) texture_worst< 4.608306 139  43 M (0.69064748 0.30935252) *
##              83) texture_worst>=4.608306 25   5 B (0.20000000 0.80000000) *
##          21) compactness_se>=-3.494301 118  42 B (0.35593220 0.64406780)  
##            42) compactness_se>=-3.445472 87  42 B (0.48275862 0.51724138)  
##              84) texture_worst>=4.558285 30   7 M (0.76666667 0.23333333) *
##              85) texture_worst< 4.558285 57  19 B (0.33333333 0.66666667) *
##            43) compactness_se< -3.445472 31   0 B (0.00000000 1.00000000) *
##        11) compactness_se< -4.025757 115  29 B (0.25217391 0.74782609)  
##          22) smoothness_mean< -2.290664 41  13 M (0.68292683 0.31707317)  
##            44) smoothness_mean>=-2.333927 34   6 M (0.82352941 0.17647059)  
##              88) compactness_se>=-4.656191 31   3 M (0.90322581 0.09677419) *
##              89) compactness_se< -4.656191 3   0 B (0.00000000 1.00000000) *
##            45) smoothness_mean< -2.333927 7   0 B (0.00000000 1.00000000) *
##          23) smoothness_mean>=-2.290664 74   1 B (0.01351351 0.98648649)  
##            46) smoothness_mean>=-2.21595 16   1 B (0.06250000 0.93750000)  
##              92) smoothness_mean< -2.210016 2   1 M (0.50000000 0.50000000) *
##              93) smoothness_mean>=-2.210016 14   0 B (0.00000000 1.00000000) *
##            47) smoothness_mean< -2.21595 58   0 B (0.00000000 1.00000000) *
##     3) smoothness_mean< -2.367658 412 160 B (0.38834951 0.61165049)  
##       6) compactness_se< -4.098964 197  94 M (0.52284264 0.47715736)  
##        12) compactness_se>=-4.260936 51   4 M (0.92156863 0.07843137)  
##          24) smoothness_worst>=-1.667778 49   2 M (0.95918367 0.04081633)  
##            48) smoothness_worst< -1.458133 48   1 M (0.97916667 0.02083333)  
##              96) smoothness_mean>=-2.440377 41   0 M (1.00000000 0.00000000) *
##              97) smoothness_mean< -2.440377 7   1 M (0.85714286 0.14285714) *
##            49) smoothness_worst>=-1.458133 1   0 B (0.00000000 1.00000000) *
##          25) smoothness_worst< -1.667778 2   0 B (0.00000000 1.00000000) *
##        13) compactness_se< -4.260936 146  56 B (0.38356164 0.61643836)  
##          26) compactness_se< -4.658767 28   6 M (0.78571429 0.21428571)  
##            52) compactness_se>=-4.737326 22   0 M (1.00000000 0.00000000) *
##            53) compactness_se< -4.737326 6   0 B (0.00000000 1.00000000) *
##          27) compactness_se>=-4.658767 118  34 B (0.28813559 0.71186441)  
##            54) compactness_se>=-4.356557 53  26 B (0.49056604 0.50943396)  
##             108) smoothness_mean< -2.45841 24   3 M (0.87500000 0.12500000) *
##             109) smoothness_mean>=-2.45841 29   5 B (0.17241379 0.82758621) *
##            55) compactness_se< -4.356557 65   8 B (0.12307692 0.87692308)  
##             110) texture_mean>=2.985131 30   8 B (0.26666667 0.73333333) *
##             111) texture_mean< 2.985131 35   0 B (0.00000000 1.00000000) *
##       7) compactness_se>=-4.098964 215  57 B (0.26511628 0.73488372)  
##        14) texture_worst>=4.569119 116  46 B (0.39655172 0.60344828)  
##          28) texture_worst< 4.683744 43  13 M (0.69767442 0.30232558)  
##            56) smoothness_worst< -1.532274 26   3 M (0.88461538 0.11538462)  
##             112) texture_mean< 3.146714 24   1 M (0.95833333 0.04166667) *
##             113) texture_mean>=3.146714 2   0 B (0.00000000 1.00000000) *
##            57) smoothness_worst>=-1.532274 17   7 B (0.41176471 0.58823529)  
##             114) texture_mean>=2.955415 7   0 M (1.00000000 0.00000000) *
##             115) texture_mean< 2.955415 10   0 B (0.00000000 1.00000000) *
##          29) texture_worst>=4.683744 73  16 B (0.21917808 0.78082192)  
##            58) texture_mean>=3.428781 3   0 M (1.00000000 0.00000000) *
##            59) texture_mean< 3.428781 70  13 B (0.18571429 0.81428571)  
##             118) symmetry_worst>=-1.541072 5   1 M (0.80000000 0.20000000) *
##             119) symmetry_worst< -1.541072 65   9 B (0.13846154 0.86153846) *
##        15) texture_worst< 4.569119 99  11 B (0.11111111 0.88888889)  
##          30) smoothness_mean>=-2.376139 4   0 M (1.00000000 0.00000000) *
##          31) smoothness_mean< -2.376139 95   7 B (0.07368421 0.92631579)  
##            62) smoothness_worst>=-1.455747 2   0 M (1.00000000 0.00000000) *
##            63) smoothness_worst< -1.455747 93   5 B (0.05376344 0.94623656)  
##             126) texture_worst< 3.981964 11   4 B (0.36363636 0.63636364) *
##             127) texture_worst>=3.981964 82   1 B (0.01219512 0.98780488) *
## 
## $trees[[47]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 432 B (0.47368421 0.52631579)  
##     2) texture_mean>=2.960364 442 186 M (0.57918552 0.42081448)  
##       4) smoothness_mean< -2.106736 422 168 M (0.60189573 0.39810427)  
##         8) compactness_se>=-4.275512 330 116 M (0.64848485 0.35151515)  
##          16) smoothness_worst>=-1.618016 308 100 M (0.67532468 0.32467532)  
##            32) compactness_se< -4.096569 32   1 M (0.96875000 0.03125000)  
##              64) symmetry_worst>=-2.253809 31   0 M (1.00000000 0.00000000) *
##              65) symmetry_worst< -2.253809 1   0 B (0.00000000 1.00000000) *
##            33) compactness_se>=-4.096569 276  99 M (0.64130435 0.35869565)  
##              66) compactness_se>=-4.05446 262  85 M (0.67557252 0.32442748) *
##              67) compactness_se< -4.05446 14   0 B (0.00000000 1.00000000) *
##          17) smoothness_worst< -1.618016 22   6 B (0.27272727 0.72727273)  
##            34) compactness_se>=-3.004445 7   2 M (0.71428571 0.28571429)  
##              68) texture_mean>=3.038737 5   0 M (1.00000000 0.00000000) *
##              69) texture_mean< 3.038737 2   0 B (0.00000000 1.00000000) *
##            35) compactness_se< -3.004445 15   1 B (0.06666667 0.93333333)  
##              70) compactness_se< -4.171292 1   0 M (1.00000000 0.00000000) *
##              71) compactness_se>=-4.171292 14   0 B (0.00000000 1.00000000) *
##         9) compactness_se< -4.275512 92  40 B (0.43478261 0.56521739)  
##          18) smoothness_mean< -2.40097 51  17 M (0.66666667 0.33333333)  
##            36) symmetry_worst>=-1.953246 37   5 M (0.86486486 0.13513514)  
##              72) texture_mean< 3.192081 34   2 M (0.94117647 0.05882353) *
##              73) texture_mean>=3.192081 3   0 B (0.00000000 1.00000000) *
##            37) symmetry_worst< -1.953246 14   2 B (0.14285714 0.85714286)  
##              74) smoothness_worst>=-1.552639 2   0 M (1.00000000 0.00000000) *
##              75) smoothness_worst< -1.552639 12   0 B (0.00000000 1.00000000) *
##          19) smoothness_mean>=-2.40097 41   6 B (0.14634146 0.85365854)  
##            38) smoothness_worst< -1.628375 2   0 M (1.00000000 0.00000000) *
##            39) smoothness_worst>=-1.628375 39   4 B (0.10256410 0.89743590)  
##              78) smoothness_worst>=-1.433001 1   0 M (1.00000000 0.00000000) *
##              79) smoothness_worst< -1.433001 38   3 B (0.07894737 0.92105263) *
##       5) smoothness_mean>=-2.106736 20   2 B (0.10000000 0.90000000)  
##        10) smoothness_mean>=-2.05387 2   0 M (1.00000000 0.00000000) *
##        11) smoothness_mean< -2.05387 18   0 B (0.00000000 1.00000000) *
##     3) texture_mean< 2.960364 470 176 B (0.37446809 0.62553191)  
##       6) symmetry_worst>=-1.348749 24   4 M (0.83333333 0.16666667)  
##        12) compactness_se< -2.588521 22   2 M (0.90909091 0.09090909)  
##          24) smoothness_mean>=-2.360133 21   1 M (0.95238095 0.04761905)  
##            48) smoothness_mean< -2.022167 20   0 M (1.00000000 0.00000000) *
##            49) smoothness_mean>=-2.022167 1   0 B (0.00000000 1.00000000) *
##          25) smoothness_mean< -2.360133 1   0 B (0.00000000 1.00000000) *
##        13) compactness_se>=-2.588521 2   0 B (0.00000000 1.00000000) *
##       7) symmetry_worst< -1.348749 446 156 B (0.34977578 0.65022422)  
##        14) texture_mean>=2.708379 400 153 B (0.38250000 0.61750000)  
##          28) symmetry_worst< -1.707562 242 113 B (0.46694215 0.53305785)  
##            56) texture_worst< 4.54138 175  78 M (0.55428571 0.44571429)  
##             112) texture_worst>=4.507583 36   3 M (0.91666667 0.08333333) *
##             113) texture_worst< 4.507583 139  64 B (0.46043165 0.53956835) *
##            57) texture_worst>=4.54138 67  16 B (0.23880597 0.76119403)  
##             114) smoothness_mean< -2.350326 27  11 M (0.59259259 0.40740741) *
##             115) smoothness_mean>=-2.350326 40   0 B (0.00000000 1.00000000) *
##          29) symmetry_worst>=-1.707562 158  40 B (0.25316456 0.74683544)  
##            58) smoothness_mean>=-2.177741 21   3 M (0.85714286 0.14285714)  
##             116) texture_worst>=4.148692 18   0 M (1.00000000 0.00000000) *
##             117) texture_worst< 4.148692 3   0 B (0.00000000 1.00000000) *
##            59) smoothness_mean< -2.177741 137  22 B (0.16058394 0.83941606)  
##             118) smoothness_worst>=-1.482509 50  17 B (0.34000000 0.66000000) *
##             119) smoothness_worst< -1.482509 87   5 B (0.05747126 0.94252874) *
##        15) texture_mean< 2.708379 46   3 B (0.06521739 0.93478261)  
##          30) smoothness_mean>=-2.074653 7   3 B (0.42857143 0.57142857)  
##            60) smoothness_mean< -2.060513 3   0 M (1.00000000 0.00000000) *
##            61) smoothness_mean>=-2.060513 4   0 B (0.00000000 1.00000000) *
##          31) smoothness_mean< -2.074653 39   0 B (0.00000000 1.00000000) *
## 
## $trees[[48]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 912 430 B (0.47149123 0.52850877)  
##    2) texture_mean>=2.708379 870 425 B (0.48850575 0.51149425)  
##      4) texture_worst< 4.467083 259  96 M (0.62934363 0.37065637)  
##        8) smoothness_mean< -2.262885 205  60 M (0.70731707 0.29268293)  
##         16) smoothness_worst>=-1.637109 193  48 M (0.75129534 0.24870466)  
##           32) texture_worst>=4.254671 131  21 M (0.83969466 0.16030534)  
##             64) symmetry_worst< -1.403642 127  17 M (0.86614173 0.13385827) *
##             65) symmetry_worst>=-1.403642 4   0 B (0.00000000 1.00000000) *
##           33) texture_worst< 4.254671 62  27 M (0.56451613 0.43548387)  
##             66) texture_worst< 4.190306 49  14 M (0.71428571 0.28571429) *
##             67) texture_worst>=4.190306 13   0 B (0.00000000 1.00000000) *
##         17) smoothness_worst< -1.637109 12   0 B (0.00000000 1.00000000) *
##        9) smoothness_mean>=-2.262885 54  18 B (0.33333333 0.66666667)  
##         18) smoothness_mean>=-2.175971 24   7 M (0.70833333 0.29166667)  
##           36) compactness_se>=-3.668604 18   1 M (0.94444444 0.05555556)  
##             72) texture_worst>=3.952268 17   0 M (1.00000000 0.00000000) *
##             73) texture_worst< 3.952268 1   0 B (0.00000000 1.00000000) *
##           37) compactness_se< -3.668604 6   0 B (0.00000000 1.00000000) *
##         19) smoothness_mean< -2.175971 30   1 B (0.03333333 0.96666667)  
##           38) texture_worst< 4.036973 1   0 M (1.00000000 0.00000000) *
##           39) texture_worst>=4.036973 29   0 B (0.00000000 1.00000000) *
##      5) texture_worst>=4.467083 611 262 B (0.42880524 0.57119476)  
##       10) symmetry_worst>=-1.549706 115  45 M (0.60869565 0.39130435)  
##         20) compactness_se< -3.180898 105  36 M (0.65714286 0.34285714)  
##           40) texture_worst>=4.61159 75  17 M (0.77333333 0.22666667)  
##             80) compactness_se>=-4.694501 72  14 M (0.80555556 0.19444444) *
##             81) compactness_se< -4.694501 3   0 B (0.00000000 1.00000000) *
##           41) texture_worst< 4.61159 30  11 B (0.36666667 0.63333333)  
##             82) texture_mean>=2.950145 13   2 M (0.84615385 0.15384615) *
##             83) texture_mean< 2.950145 17   0 B (0.00000000 1.00000000) *
##         21) compactness_se>=-3.180898 10   1 B (0.10000000 0.90000000)  
##           42) smoothness_mean>=-2.346429 1   0 M (1.00000000 0.00000000) *
##           43) smoothness_mean< -2.346429 9   0 B (0.00000000 1.00000000) *
##       11) symmetry_worst< -1.549706 496 192 B (0.38709677 0.61290323)  
##         22) texture_mean>=2.929857 400 173 B (0.43250000 0.56750000)  
##           44) symmetry_worst< -1.606972 355 167 B (0.47042254 0.52957746)  
##             88) smoothness_worst< -1.438548 319 157 M (0.50783699 0.49216301) *
##             89) smoothness_worst>=-1.438548 36   5 B (0.13888889 0.86111111) *
##           45) symmetry_worst>=-1.606972 45   6 B (0.13333333 0.86666667)  
##             90) texture_worst< 4.581608 4   0 M (1.00000000 0.00000000) *
##             91) texture_worst>=4.581608 41   2 B (0.04878049 0.95121951) *
##         23) texture_mean< 2.929857 96  19 B (0.19791667 0.80208333)  
##           46) texture_worst< 4.545141 36  15 B (0.41666667 0.58333333)  
##             92) texture_worst>=4.508732 16   1 M (0.93750000 0.06250000) *
##             93) texture_worst< 4.508732 20   0 B (0.00000000 1.00000000) *
##           47) texture_worst>=4.545141 60   4 B (0.06666667 0.93333333)  
##             94) compactness_se< -4.391048 10   4 B (0.40000000 0.60000000) *
##             95) compactness_se>=-4.391048 50   0 B (0.00000000 1.00000000) *
##    3) texture_mean< 2.708379 42   5 B (0.11904762 0.88095238)  
##      6) symmetry_worst>=-1.552505 13   5 B (0.38461538 0.61538462)  
##       12) texture_mean< 2.518783 4   0 M (1.00000000 0.00000000) *
##       13) texture_mean>=2.518783 9   1 B (0.11111111 0.88888889)  
##         26) compactness_se>=-3.3026 1   0 M (1.00000000 0.00000000) *
##         27) compactness_se< -3.3026 8   0 B (0.00000000 1.00000000) *
##      7) symmetry_worst< -1.552505 29   0 B (0.00000000 1.00000000) *
## 
## $trees[[49]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 363 B (0.39802632 0.60197368)  
##     2) smoothness_worst>=-1.501069 361 176 B (0.48753463 0.51246537)  
##       4) smoothness_worst< -1.476605 126  46 M (0.63492063 0.36507937)  
##         8) smoothness_worst>=-1.482699 47   5 M (0.89361702 0.10638298)  
##          16) texture_worst>=4.136746 45   3 M (0.93333333 0.06666667)  
##            32) smoothness_mean< -2.241789 36   0 M (1.00000000 0.00000000) *
##            33) smoothness_mean>=-2.241789 9   3 M (0.66666667 0.33333333)  
##              66) texture_mean>=2.858739 6   0 M (1.00000000 0.00000000) *
##              67) texture_mean< 2.858739 3   0 B (0.00000000 1.00000000) *
##          17) texture_worst< 4.136746 2   0 B (0.00000000 1.00000000) *
##         9) smoothness_worst< -1.482699 79  38 B (0.48101266 0.51898734)  
##          18) smoothness_worst< -1.49223 41  11 M (0.73170732 0.26829268)  
##            36) symmetry_worst< -1.456355 37   7 M (0.81081081 0.18918919)  
##              72) smoothness_mean>=-2.374383 33   4 M (0.87878788 0.12121212) *
##              73) smoothness_mean< -2.374383 4   1 B (0.25000000 0.75000000) *
##            37) symmetry_worst>=-1.456355 4   0 B (0.00000000 1.00000000) *
##          19) smoothness_worst>=-1.49223 38   8 B (0.21052632 0.78947368)  
##            38) symmetry_worst>=-1.413975 3   0 M (1.00000000 0.00000000) *
##            39) symmetry_worst< -1.413975 35   5 B (0.14285714 0.85714286)  
##              78) texture_mean>=3.407548 2   0 M (1.00000000 0.00000000) *
##              79) texture_mean< 3.407548 33   3 B (0.09090909 0.90909091) *
##       5) smoothness_worst>=-1.476605 235  96 B (0.40851064 0.59148936)  
##        10) smoothness_worst>=-1.473476 193  95 B (0.49222798 0.50777202)  
##          20) smoothness_mean< -2.300091 52  13 M (0.75000000 0.25000000)  
##            40) smoothness_mean>=-2.362601 33   2 M (0.93939394 0.06060606)  
##              80) compactness_se>=-4.494315 31   0 M (1.00000000 0.00000000) *
##              81) compactness_se< -4.494315 2   0 B (0.00000000 1.00000000) *
##            41) smoothness_mean< -2.362601 19   8 B (0.42105263 0.57894737)  
##              82) symmetry_worst>=-1.446218 7   0 M (1.00000000 0.00000000) *
##              83) symmetry_worst< -1.446218 12   1 B (0.08333333 0.91666667) *
##          21) smoothness_mean>=-2.300091 141  56 B (0.39716312 0.60283688)  
##            42) compactness_se>=-4.02632 111  54 B (0.48648649 0.51351351)  
##              84) smoothness_mean>=-2.288684 89  35 M (0.60674157 0.39325843) *
##              85) smoothness_mean< -2.288684 22   0 B (0.00000000 1.00000000) *
##            43) compactness_se< -4.02632 30   2 B (0.06666667 0.93333333)  
##              86) symmetry_worst< -1.743442 5   2 B (0.40000000 0.60000000) *
##              87) symmetry_worst>=-1.743442 25   0 B (0.00000000 1.00000000) *
##        11) smoothness_worst< -1.473476 42   1 B (0.02380952 0.97619048)  
##          22) texture_mean>=3.069079 1   0 M (1.00000000 0.00000000) *
##          23) texture_mean< 3.069079 41   0 B (0.00000000 1.00000000) *
##     3) smoothness_worst< -1.501069 551 187 B (0.33938294 0.66061706)  
##       6) texture_worst< 4.467472 160  77 M (0.51875000 0.48125000)  
##        12) compactness_se< -3.48221 132  51 M (0.61363636 0.38636364)  
##          24) texture_worst>=3.891616 118  37 M (0.68644068 0.31355932)  
##            48) compactness_se>=-4.519704 102  23 M (0.77450980 0.22549020)  
##              96) symmetry_worst< -1.576447 95  16 M (0.83157895 0.16842105) *
##              97) symmetry_worst>=-1.576447 7   0 B (0.00000000 1.00000000) *
##            49) compactness_se< -4.519704 16   2 B (0.12500000 0.87500000)  
##              98) smoothness_mean>=-2.306694 2   0 M (1.00000000 0.00000000) *
##              99) smoothness_mean< -2.306694 14   0 B (0.00000000 1.00000000) *
##          25) texture_worst< 3.891616 14   0 B (0.00000000 1.00000000) *
##        13) compactness_se>=-3.48221 28   2 B (0.07142857 0.92857143)  
##          26) texture_worst>=4.411908 2   0 M (1.00000000 0.00000000) *
##          27) texture_worst< 4.411908 26   0 B (0.00000000 1.00000000) *
##       7) texture_worst>=4.467472 391 104 B (0.26598465 0.73401535)  
##        14) compactness_se>=-3.005655 20   6 M (0.70000000 0.30000000)  
##          28) texture_mean>=3.058386 14   0 M (1.00000000 0.00000000) *
##          29) texture_mean< 3.058386 6   0 B (0.00000000 1.00000000) *
##        15) compactness_se< -3.005655 371  90 B (0.24258760 0.75741240)  
##          30) compactness_se< -4.032373 140  55 B (0.39285714 0.60714286)  
##            60) texture_worst>=5.149489 19   3 M (0.84210526 0.15789474)  
##             120) symmetry_worst>=-2.036673 16   0 M (1.00000000 0.00000000) *
##             121) symmetry_worst< -2.036673 3   0 B (0.00000000 1.00000000) *
##            61) texture_worst< 5.149489 121  39 B (0.32231405 0.67768595)  
##             122) smoothness_worst< -1.607486 21   6 M (0.71428571 0.28571429) *
##             123) smoothness_worst>=-1.607486 100  24 B (0.24000000 0.76000000) *
##          31) compactness_se>=-4.032373 231  35 B (0.15151515 0.84848485)  
##            62) symmetry_worst>=-1.789477 91  24 B (0.26373626 0.73626374)  
##             124) symmetry_worst< -1.767566 8   0 M (1.00000000 0.00000000) *
##             125) symmetry_worst>=-1.767566 83  16 B (0.19277108 0.80722892) *
##            63) symmetry_worst< -1.789477 140  11 B (0.07857143 0.92142857)  
##             126) texture_mean>=3.313386 9   4 M (0.55555556 0.44444444) *
##             127) texture_mean< 3.313386 131   6 B (0.04580153 0.95419847) *
## 
## $trees[[50]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 372 B (0.40789474 0.59210526)  
##     2) symmetry_worst>=-1.775265 446 215 M (0.51793722 0.48206278)  
##       4) texture_mean>=2.920739 316 125 M (0.60443038 0.39556962)  
##         8) texture_worst< 4.930927 249  82 M (0.67068273 0.32931727)  
##          16) compactness_se< -4.07887 78   9 M (0.88461538 0.11538462)  
##            32) symmetry_worst< -1.51291 64   2 M (0.96875000 0.03125000)  
##              64) smoothness_mean< -2.306529 59   0 M (1.00000000 0.00000000) *
##              65) smoothness_mean>=-2.306529 5   2 M (0.60000000 0.40000000) *
##            33) symmetry_worst>=-1.51291 14   7 M (0.50000000 0.50000000)  
##              66) texture_mean>=2.99247 7   0 M (1.00000000 0.00000000) *
##              67) texture_mean< 2.99247 7   0 B (0.00000000 1.00000000) *
##          17) compactness_se>=-4.07887 171  73 M (0.57309942 0.42690058)  
##            34) compactness_se>=-3.681558 102  26 M (0.74509804 0.25490196)  
##              68) compactness_se< -3.502612 41   0 M (1.00000000 0.00000000) *
##              69) compactness_se>=-3.502612 61  26 M (0.57377049 0.42622951) *
##            35) compactness_se< -3.681558 69  22 B (0.31884058 0.68115942)  
##              70) smoothness_worst>=-1.462341 24   6 M (0.75000000 0.25000000) *
##              71) smoothness_worst< -1.462341 45   4 B (0.08888889 0.91111111) *
##         9) texture_worst>=4.930927 67  24 B (0.35820896 0.64179104)  
##          18) texture_worst>=5.003123 33  11 M (0.66666667 0.33333333)  
##            36) texture_mean< 3.225651 14   0 M (1.00000000 0.00000000) *
##            37) texture_mean>=3.225651 19   8 B (0.42105263 0.57894737)  
##              74) smoothness_mean>=-2.329526 5   0 M (1.00000000 0.00000000) *
##              75) smoothness_mean< -2.329526 14   3 B (0.21428571 0.78571429) *
##          19) texture_worst< 5.003123 34   2 B (0.05882353 0.94117647)  
##            38) symmetry_worst>=-1.450896 1   0 M (1.00000000 0.00000000) *
##            39) symmetry_worst< -1.450896 33   1 B (0.03030303 0.96969697)  
##              78) texture_mean< 3.088538 4   1 B (0.25000000 0.75000000) *
##              79) texture_mean>=3.088538 29   0 B (0.00000000 1.00000000) *
##       5) texture_mean< 2.920739 130  40 B (0.30769231 0.69230769)  
##        10) texture_mean< 2.850705 92  39 B (0.42391304 0.57608696)  
##          20) texture_mean>=2.824054 26   0 M (1.00000000 0.00000000) *
##          21) texture_mean< 2.824054 66  13 B (0.19696970 0.80303030)  
##            42) smoothness_worst>=-1.491834 34  13 B (0.38235294 0.61764706)  
##              84) texture_worst>=4.110502 14   5 M (0.64285714 0.35714286) *
##              85) texture_worst< 4.110502 20   4 B (0.20000000 0.80000000) *
##            43) smoothness_worst< -1.491834 32   0 B (0.00000000 1.00000000) *
##        11) texture_mean>=2.850705 38   1 B (0.02631579 0.97368421)  
##          22) smoothness_mean>=-2.213964 1   0 M (1.00000000 0.00000000) *
##          23) smoothness_mean< -2.213964 37   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.775265 466 141 B (0.30257511 0.69742489)  
##       6) smoothness_worst>=-1.603315 406 138 B (0.33990148 0.66009852)  
##        12) smoothness_worst< -1.602623 10   0 M (1.00000000 0.00000000) *
##        13) smoothness_worst>=-1.602623 396 128 B (0.32323232 0.67676768)  
##          26) compactness_se< -3.869459 175  75 B (0.42857143 0.57142857)  
##            52) texture_worst>=4.907333 33   6 M (0.81818182 0.18181818)  
##             104) texture_mean< 3.353705 30   3 M (0.90000000 0.10000000) *
##             105) texture_mean>=3.353705 3   0 B (0.00000000 1.00000000) *
##            53) texture_worst< 4.907333 142  48 B (0.33802817 0.66197183)  
##             106) texture_worst< 4.751011 109  48 B (0.44036697 0.55963303) *
##             107) texture_worst>=4.751011 33   0 B (0.00000000 1.00000000) *
##          27) compactness_se>=-3.869459 221  53 B (0.23981900 0.76018100)  
##            54) smoothness_mean< -2.473552 12   1 M (0.91666667 0.08333333)  
##             108) smoothness_mean>=-2.497829 11   0 M (1.00000000 0.00000000) *
##             109) smoothness_mean< -2.497829 1   0 B (0.00000000 1.00000000) *
##            55) smoothness_mean>=-2.473552 209  42 B (0.20095694 0.79904306)  
##             110) smoothness_mean>=-2.377849 129  42 B (0.32558140 0.67441860) *
##             111) smoothness_mean< -2.377849 80   0 B (0.00000000 1.00000000) *
##       7) smoothness_worst< -1.603315 60   3 B (0.05000000 0.95000000)  
##        14) compactness_se>=-2.951614 1   0 M (1.00000000 0.00000000) *
##        15) compactness_se< -2.951614 59   2 B (0.03389831 0.96610169)  
##          30) compactness_se< -4.691681 6   2 B (0.33333333 0.66666667)  
##            60) compactness_se>=-4.711555 2   0 M (1.00000000 0.00000000) *
##            61) compactness_se< -4.711555 4   0 B (0.00000000 1.00000000) *
##          31) compactness_se>=-4.691681 53   0 B (0.00000000 1.00000000) *
## 
## $trees[[51]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 388 B (0.42543860 0.57456140)  
##     2) texture_mean>=2.853016 706 326 B (0.46175637 0.53824363)  
##       4) smoothness_worst>=-1.556752 499 243 M (0.51302605 0.48697395)  
##         8) smoothness_mean< -2.349264 199  67 M (0.66331658 0.33668342)  
##          16) compactness_se< -3.938851 115  20 M (0.82608696 0.17391304)  
##            32) smoothness_mean>=-2.473387 103   8 M (0.92233010 0.07766990)  
##              64) symmetry_worst>=-1.959872 83   2 M (0.97590361 0.02409639) *
##              65) symmetry_worst< -1.959872 20   6 M (0.70000000 0.30000000) *
##            33) smoothness_mean< -2.473387 12   0 B (0.00000000 1.00000000) *
##          17) compactness_se>=-3.938851 84  37 B (0.44047619 0.55952381)  
##            34) smoothness_mean>=-2.362071 15   0 M (1.00000000 0.00000000) *
##            35) smoothness_mean< -2.362071 69  22 B (0.31884058 0.68115942)  
##              70) smoothness_mean< -2.461054 12   0 M (1.00000000 0.00000000) *
##              71) smoothness_mean>=-2.461054 57  10 B (0.17543860 0.82456140) *
##         9) smoothness_mean>=-2.349264 300 124 B (0.41333333 0.58666667)  
##          18) compactness_se>=-3.219881 34   8 M (0.76470588 0.23529412)  
##            36) smoothness_mean>=-2.332581 30   4 M (0.86666667 0.13333333)  
##              72) smoothness_mean< -2.224699 21   0 M (1.00000000 0.00000000) *
##              73) smoothness_mean>=-2.224699 9   4 M (0.55555556 0.44444444) *
##            37) smoothness_mean< -2.332581 4   0 B (0.00000000 1.00000000) *
##          19) compactness_se< -3.219881 266  98 B (0.36842105 0.63157895)  
##            38) texture_mean< 3.039744 166  76 B (0.45783133 0.54216867)  
##              76) compactness_se>=-3.669769 54  15 M (0.72222222 0.27777778) *
##              77) compactness_se< -3.669769 112  37 B (0.33035714 0.66964286) *
##            39) texture_mean>=3.039744 100  22 B (0.22000000 0.78000000)  
##              78) texture_worst< 4.664833 5   0 M (1.00000000 0.00000000) *
##              79) texture_worst>=4.664833 95  17 B (0.17894737 0.82105263) *
##       5) smoothness_worst< -1.556752 207  70 B (0.33816425 0.66183575)  
##        10) symmetry_worst< -2.242382 22   3 M (0.86363636 0.13636364)  
##          20) smoothness_worst>=-1.645584 16   0 M (1.00000000 0.00000000) *
##          21) smoothness_worst< -1.645584 6   3 M (0.50000000 0.50000000)  
##            42) texture_mean>=3.175045 3   0 M (1.00000000 0.00000000) *
##            43) texture_mean< 3.175045 3   0 B (0.00000000 1.00000000) *
##        11) symmetry_worst>=-2.242382 185  51 B (0.27567568 0.72432432)  
##          22) smoothness_worst< -1.568787 127  45 B (0.35433071 0.64566929)  
##            44) smoothness_worst>=-1.584838 24   5 M (0.79166667 0.20833333)  
##              88) texture_mean>=2.924461 19   0 M (1.00000000 0.00000000) *
##              89) texture_mean< 2.924461 5   0 B (0.00000000 1.00000000) *
##            45) smoothness_worst< -1.584838 103  26 B (0.25242718 0.74757282)  
##              90) symmetry_worst>=-1.795801 43  21 B (0.48837209 0.51162791) *
##              91) symmetry_worst< -1.795801 60   5 B (0.08333333 0.91666667) *
##          23) smoothness_worst>=-1.568787 58   6 B (0.10344828 0.89655172)  
##            46) smoothness_mean>=-2.299648 2   0 M (1.00000000 0.00000000) *
##            47) smoothness_mean< -2.299648 56   4 B (0.07142857 0.92857143)  
##              94) compactness_se>=-2.682598 2   0 M (1.00000000 0.00000000) *
##              95) compactness_se< -2.682598 54   2 B (0.03703704 0.96296296) *
##     3) texture_mean< 2.853016 206  62 B (0.30097087 0.69902913)  
##       6) smoothness_worst< -1.468426 135  54 B (0.40000000 0.60000000)  
##        12) smoothness_mean>=-2.443746 112  54 B (0.48214286 0.51785714)  
##          24) texture_mean>=2.656405 99  45 M (0.54545455 0.45454545)  
##            48) texture_mean< 2.744378 30   3 M (0.90000000 0.10000000)  
##              96) smoothness_mean< -2.205363 28   1 M (0.96428571 0.03571429) *
##              97) smoothness_mean>=-2.205363 2   0 B (0.00000000 1.00000000) *
##            49) texture_mean>=2.744378 69  27 B (0.39130435 0.60869565)  
##              98) smoothness_mean< -2.437331 7   0 M (1.00000000 0.00000000) *
##              99) smoothness_mean>=-2.437331 62  20 B (0.32258065 0.67741935) *
##          25) texture_mean< 2.656405 13   0 B (0.00000000 1.00000000) *
##        13) smoothness_mean< -2.443746 23   0 B (0.00000000 1.00000000) *
##       7) smoothness_worst>=-1.468426 71   8 B (0.11267606 0.88732394)  
##        14) texture_worst>=4.398698 3   0 M (1.00000000 0.00000000) *
##        15) texture_worst< 4.398698 68   5 B (0.07352941 0.92647059)  
##          30) symmetry_worst>=-1.619683 32   5 B (0.15625000 0.84375000)  
##            60) symmetry_worst< -1.483416 6   2 M (0.66666667 0.33333333)  
##             120) compactness_se>=-3.961508 4   0 M (1.00000000 0.00000000) *
##             121) compactness_se< -3.961508 2   0 B (0.00000000 1.00000000) *
##            61) symmetry_worst>=-1.483416 26   1 B (0.03846154 0.96153846)  
##             122) smoothness_worst< -1.454202 1   0 M (1.00000000 0.00000000) *
##             123) smoothness_worst>=-1.454202 25   0 B (0.00000000 1.00000000) *
##          31) symmetry_worst< -1.619683 36   0 B (0.00000000 1.00000000) *
## 
## $trees[[52]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 448 M (0.50877193 0.49122807)  
##     2) compactness_se< -3.867535 482 199 M (0.58713693 0.41286307)  
##       4) texture_mean>=2.754924 447 164 M (0.63310962 0.36689038)  
##         8) compactness_se>=-3.883925 38   0 M (1.00000000 0.00000000) *
##         9) compactness_se< -3.883925 409 164 M (0.59902200 0.40097800)  
##          18) symmetry_worst>=-1.966052 318 105 M (0.66981132 0.33018868)  
##            36) compactness_se< -3.902076 303  90 M (0.70297030 0.29702970)  
##              72) smoothness_mean< -2.300091 240  56 M (0.76666667 0.23333333) *
##              73) smoothness_mean>=-2.300091 63  29 B (0.46031746 0.53968254) *
##            37) compactness_se>=-3.902076 15   0 B (0.00000000 1.00000000) *
##          19) symmetry_worst< -1.966052 91  32 B (0.35164835 0.64835165)  
##            38) symmetry_worst< -2.170754 31  10 M (0.67741935 0.32258065)  
##              76) smoothness_mean>=-2.392268 24   3 M (0.87500000 0.12500000) *
##              77) smoothness_mean< -2.392268 7   0 B (0.00000000 1.00000000) *
##            39) symmetry_worst>=-2.170754 60  11 B (0.18333333 0.81666667)  
##              78) smoothness_worst>=-1.525709 15   5 M (0.66666667 0.33333333) *
##              79) smoothness_worst< -1.525709 45   1 B (0.02222222 0.97777778) *
##       5) texture_mean< 2.754924 35   0 B (0.00000000 1.00000000) *
##     3) compactness_se>=-3.867535 430 181 B (0.42093023 0.57906977)  
##       6) compactness_se>=-3.721197 345 165 B (0.47826087 0.52173913)  
##        12) compactness_se< -3.575734 77  21 M (0.72727273 0.27272727)  
##          24) smoothness_mean>=-2.423737 53   6 M (0.88679245 0.11320755)  
##            48) texture_mean>=2.647471 50   3 M (0.94000000 0.06000000)  
##              96) symmetry_worst>=-2.174989 44   0 M (1.00000000 0.00000000) *
##              97) symmetry_worst< -2.174989 6   3 M (0.50000000 0.50000000) *
##            49) texture_mean< 2.647471 3   0 B (0.00000000 1.00000000) *
##          25) smoothness_mean< -2.423737 24   9 B (0.37500000 0.62500000)  
##            50) symmetry_worst>=-1.813961 10   1 M (0.90000000 0.10000000)  
##             100) smoothness_mean< -2.452956 9   0 M (1.00000000 0.00000000) *
##             101) smoothness_mean>=-2.452956 1   0 B (0.00000000 1.00000000) *
##            51) symmetry_worst< -1.813961 14   0 B (0.00000000 1.00000000) *
##        13) compactness_se>=-3.575734 268 109 B (0.40671642 0.59328358)  
##          26) texture_mean>=3.059388 81  29 M (0.64197531 0.35802469)  
##            52) texture_worst< 4.745147 17   0 M (1.00000000 0.00000000) *
##            53) texture_worst>=4.745147 64  29 M (0.54687500 0.45312500)  
##             106) texture_worst>=5.016194 23   5 M (0.78260870 0.21739130) *
##             107) texture_worst< 5.016194 41  17 B (0.41463415 0.58536585) *
##          27) texture_mean< 3.059388 187  57 B (0.30481283 0.69518717)  
##            54) smoothness_worst>=-1.502084 87  43 M (0.50574713 0.49425287)  
##             108) smoothness_worst< -1.476605 30   5 M (0.83333333 0.16666667) *
##             109) smoothness_worst>=-1.476605 57  19 B (0.33333333 0.66666667) *
##            55) smoothness_worst< -1.502084 100  13 B (0.13000000 0.87000000)  
##             110) texture_mean< 2.782752 15   6 M (0.60000000 0.40000000) *
##             111) texture_mean>=2.782752 85   4 B (0.04705882 0.95294118) *
##       7) compactness_se< -3.721197 85  16 B (0.18823529 0.81176471)  
##        14) smoothness_worst>=-1.48132 27  11 M (0.59259259 0.40740741)  
##          28) texture_mean>=2.971675 12   0 M (1.00000000 0.00000000) *
##          29) texture_mean< 2.971675 15   4 B (0.26666667 0.73333333)  
##            58) symmetry_worst>=-1.612049 4   0 M (1.00000000 0.00000000) *
##            59) symmetry_worst< -1.612049 11   0 B (0.00000000 1.00000000) *
##        15) smoothness_worst< -1.48132 58   0 B (0.00000000 1.00000000) *
## 
## $trees[[53]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 442 M (0.51535088 0.48464912)  
##     2) smoothness_worst>=-1.559144 682 299 M (0.56158358 0.43841642)  
##       4) symmetry_worst>=-2.233349 648 269 M (0.58487654 0.41512346)  
##         8) smoothness_mean< -2.26529 479 174 M (0.63674322 0.36325678)  
##          16) smoothness_mean>=-2.416986 388 125 M (0.67783505 0.32216495)  
##            32) symmetry_worst>=-2.207519 379 116 M (0.69393140 0.30606860)  
##              64) texture_mean< 3.36829 370 108 M (0.70810811 0.29189189) *
##              65) texture_mean>=3.36829 9   1 B (0.11111111 0.88888889) *
##            33) symmetry_worst< -2.207519 9   0 B (0.00000000 1.00000000) *
##          17) smoothness_mean< -2.416986 91  42 B (0.46153846 0.53846154)  
##            34) smoothness_worst< -1.551775 35   7 M (0.80000000 0.20000000)  
##              68) texture_mean>=2.859755 31   3 M (0.90322581 0.09677419) *
##              69) texture_mean< 2.859755 4   0 B (0.00000000 1.00000000) *
##            35) smoothness_worst>=-1.551775 56  14 B (0.25000000 0.75000000)  
##              70) texture_mean>=3.111958 22   8 M (0.63636364 0.36363636) *
##              71) texture_mean< 3.111958 34   0 B (0.00000000 1.00000000) *
##         9) smoothness_mean>=-2.26529 169  74 B (0.43786982 0.56213018)  
##          18) smoothness_mean>=-2.222851 92  38 M (0.58695652 0.41304348)  
##            36) smoothness_mean< -2.093138 73  23 M (0.68493151 0.31506849)  
##              72) compactness_se>=-3.673673 31   1 M (0.96774194 0.03225806) *
##              73) compactness_se< -3.673673 42  20 B (0.47619048 0.52380952) *
##            37) smoothness_mean>=-2.093138 19   4 B (0.21052632 0.78947368)  
##              74) symmetry_worst>=-1.596878 7   3 M (0.57142857 0.42857143) *
##              75) symmetry_worst< -1.596878 12   0 B (0.00000000 1.00000000) *
##          19) smoothness_mean< -2.222851 77  20 B (0.25974026 0.74025974)  
##            38) smoothness_mean< -2.235394 41  18 B (0.43902439 0.56097561)  
##              76) smoothness_worst>=-1.45841 21   5 M (0.76190476 0.23809524) *
##              77) smoothness_worst< -1.45841 20   2 B (0.10000000 0.90000000) *
##            39) smoothness_mean>=-2.235394 36   2 B (0.05555556 0.94444444)  
##              78) texture_mean< 2.693961 2   0 M (1.00000000 0.00000000) *
##              79) texture_mean>=2.693961 34   0 B (0.00000000 1.00000000) *
##       5) symmetry_worst< -2.233349 34   4 B (0.11764706 0.88235294)  
##        10) compactness_se>=-2.833542 4   0 M (1.00000000 0.00000000) *
##        11) compactness_se< -2.833542 30   0 B (0.00000000 1.00000000) *
##     3) smoothness_worst< -1.559144 230  87 B (0.37826087 0.62173913)  
##       6) smoothness_worst< -1.565486 196  86 B (0.43877551 0.56122449)  
##        12) texture_worst>=4.354728 158  79 M (0.50000000 0.50000000)  
##          24) symmetry_worst>=-1.720706 56  17 M (0.69642857 0.30357143)  
##            48) texture_mean>=2.958609 48   9 M (0.81250000 0.18750000)  
##              96) smoothness_worst>=-1.660611 45   6 M (0.86666667 0.13333333) *
##              97) smoothness_worst< -1.660611 3   0 B (0.00000000 1.00000000) *
##            49) texture_mean< 2.958609 8   0 B (0.00000000 1.00000000) *
##          25) symmetry_worst< -1.720706 102  40 B (0.39215686 0.60784314)  
##            50) compactness_se>=-4.104699 50  20 M (0.60000000 0.40000000)  
##             100) texture_worst>=4.56463 33   5 M (0.84848485 0.15151515) *
##             101) texture_worst< 4.56463 17   2 B (0.11764706 0.88235294) *
##            51) compactness_se< -4.104699 52  10 B (0.19230769 0.80769231)  
##             102) symmetry_worst< -2.382417 8   0 M (1.00000000 0.00000000) *
##             103) symmetry_worst>=-2.382417 44   2 B (0.04545455 0.95454545) *
##        13) texture_worst< 4.354728 38   7 B (0.18421053 0.81578947)  
##          26) smoothness_mean>=-2.30797 5   0 M (1.00000000 0.00000000) *
##          27) smoothness_mean< -2.30797 33   2 B (0.06060606 0.93939394)  
##            54) texture_worst< 3.948691 5   2 B (0.40000000 0.60000000)  
##             108) texture_mean>=2.754513 2   0 M (1.00000000 0.00000000) *
##             109) texture_mean< 2.754513 3   0 B (0.00000000 1.00000000) *
##            55) texture_worst>=3.948691 28   0 B (0.00000000 1.00000000) *
##       7) smoothness_worst>=-1.565486 34   1 B (0.02941176 0.97058824)  
##        14) smoothness_mean>=-2.299648 1   0 M (1.00000000 0.00000000) *
##        15) smoothness_mean< -2.299648 33   0 B (0.00000000 1.00000000) *
## 
## $trees[[54]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 389 B (0.42653509 0.57346491)  
##     2) texture_worst>=4.275472 772 357 B (0.46243523 0.53756477)  
##       4) symmetry_worst>=-2.041024 641 313 M (0.51170047 0.48829953)  
##         8) smoothness_mean>=-2.21595 79  19 M (0.75949367 0.24050633)  
##          16) symmetry_worst>=-1.766269 52   3 M (0.94230769 0.05769231)  
##            32) texture_mean< 3.039982 42   0 M (1.00000000 0.00000000) *
##            33) texture_mean>=3.039982 10   3 M (0.70000000 0.30000000)  
##              66) texture_mean>=3.045947 7   0 M (1.00000000 0.00000000) *
##              67) texture_mean< 3.045947 3   0 B (0.00000000 1.00000000) *
##          17) symmetry_worst< -1.766269 27  11 B (0.40740741 0.59259259)  
##            34) symmetry_worst< -1.891461 10   1 M (0.90000000 0.10000000)  
##              68) texture_mean>=2.967292 9   0 M (1.00000000 0.00000000) *
##              69) texture_mean< 2.967292 1   0 B (0.00000000 1.00000000) *
##            35) symmetry_worst>=-1.891461 17   2 B (0.11764706 0.88235294)  
##              70) compactness_se>=-3.317826 2   0 M (1.00000000 0.00000000) *
##              71) compactness_se< -3.317826 15   0 B (0.00000000 1.00000000) *
##         9) smoothness_mean< -2.21595 562 268 B (0.47686833 0.52313167)  
##          18) texture_worst< 4.280533 24   2 M (0.91666667 0.08333333)  
##            36) smoothness_mean>=-2.473491 22   0 M (1.00000000 0.00000000) *
##            37) smoothness_mean< -2.473491 2   0 B (0.00000000 1.00000000) *
##          19) texture_worst>=4.280533 538 246 B (0.45724907 0.54275093)  
##            38) texture_worst>=4.517889 412 201 M (0.51213592 0.48786408)  
##              76) texture_worst< 4.644679 148  49 M (0.66891892 0.33108108) *
##              77) texture_worst>=4.644679 264 112 B (0.42424242 0.57575758) *
##            39) texture_worst< 4.517889 126  35 B (0.27777778 0.72222222)  
##              78) texture_mean>=2.960623 27   5 M (0.81481481 0.18518519) *
##              79) texture_mean< 2.960623 99  13 B (0.13131313 0.86868687) *
##       5) symmetry_worst< -2.041024 131  29 B (0.22137405 0.77862595)  
##        10) symmetry_worst< -2.379234 15   4 M (0.73333333 0.26666667)  
##          20) texture_mean< 3.283931 12   1 M (0.91666667 0.08333333)  
##            40) smoothness_mean< -2.287736 10   0 M (1.00000000 0.00000000) *
##            41) smoothness_mean>=-2.287736 2   1 M (0.50000000 0.50000000)  
##              82) texture_mean>=3.050671 1   0 M (1.00000000 0.00000000) *
##              83) texture_mean< 3.050671 1   0 B (0.00000000 1.00000000) *
##          21) texture_mean>=3.283931 3   0 B (0.00000000 1.00000000) *
##        11) symmetry_worst>=-2.379234 116  18 B (0.15517241 0.84482759)  
##          22) compactness_se>=-2.72933 5   1 M (0.80000000 0.20000000)  
##            44) texture_mean>=3.063909 4   0 M (1.00000000 0.00000000) *
##            45) texture_mean< 3.063909 1   0 B (0.00000000 1.00000000) *
##          23) compactness_se< -2.72933 111  14 B (0.12612613 0.87387387)  
##            46) smoothness_worst< -1.709211 2   0 M (1.00000000 0.00000000) *
##            47) smoothness_worst>=-1.709211 109  12 B (0.11009174 0.88990826)  
##              94) smoothness_worst>=-1.448989 1   0 M (1.00000000 0.00000000) *
##              95) smoothness_worst< -1.448989 108  11 B (0.10185185 0.89814815) *
##     3) texture_worst< 4.275472 140  32 B (0.22857143 0.77142857)  
##       6) texture_mean>=2.757473 65  23 B (0.35384615 0.64615385)  
##        12) texture_worst< 4.012259 10   0 M (1.00000000 0.00000000) *
##        13) texture_worst>=4.012259 55  13 B (0.23636364 0.76363636)  
##          26) texture_mean< 2.760642 7   0 M (1.00000000 0.00000000) *
##          27) texture_mean>=2.760642 48   6 B (0.12500000 0.87500000)  
##            54) symmetry_worst>=-1.431518 8   2 M (0.75000000 0.25000000)  
##             108) texture_mean< 2.914443 6   0 M (1.00000000 0.00000000) *
##             109) texture_mean>=2.914443 2   0 B (0.00000000 1.00000000) *
##            55) symmetry_worst< -1.431518 40   0 B (0.00000000 1.00000000) *
##       7) texture_mean< 2.757473 75   9 B (0.12000000 0.88000000)  
##        14) texture_mean< 2.715026 38   9 B (0.23684211 0.76315789)  
##          28) texture_mean>=2.709047 4   0 M (1.00000000 0.00000000) *
##          29) texture_mean< 2.709047 34   5 B (0.14705882 0.85294118)  
##            58) texture_mean< 2.515298 6   3 M (0.50000000 0.50000000)  
##             116) smoothness_mean< -2.060513 3   0 M (1.00000000 0.00000000) *
##             117) smoothness_mean>=-2.060513 3   0 B (0.00000000 1.00000000) *
##            59) texture_mean>=2.515298 28   2 B (0.07142857 0.92857143)  
##             118) smoothness_mean< -2.298096 6   2 B (0.33333333 0.66666667) *
##             119) smoothness_mean>=-2.298096 22   0 B (0.00000000 1.00000000) *
##        15) texture_mean>=2.715026 37   0 B (0.00000000 1.00000000) *
## 
## $trees[[55]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 448 B (0.49122807 0.50877193)  
##     2) texture_worst>=4.580648 486 191 M (0.60699588 0.39300412)  
##       4) smoothness_mean>=-2.408446 368 116 M (0.68478261 0.31521739)  
##         8) texture_worst>=4.895983 161  31 M (0.80745342 0.19254658)  
##          16) smoothness_mean>=-2.336091 120  10 M (0.91666667 0.08333333)  
##            32) smoothness_mean< -2.106736 115   6 M (0.94782609 0.05217391)  
##              64) compactness_se>=-4.032549 109   2 M (0.98165138 0.01834862) *
##              65) compactness_se< -4.032549 6   2 B (0.33333333 0.66666667) *
##            33) smoothness_mean>=-2.106736 5   1 B (0.20000000 0.80000000)  
##              66) texture_mean< 3.181902 1   0 M (1.00000000 0.00000000) *
##              67) texture_mean>=3.181902 4   0 B (0.00000000 1.00000000) *
##          17) smoothness_mean< -2.336091 41  20 B (0.48780488 0.51219512)  
##            34) smoothness_worst< -1.530302 14   1 M (0.92857143 0.07142857)  
##              68) texture_mean< 3.392124 13   0 M (1.00000000 0.00000000) *
##              69) texture_mean>=3.392124 1   0 B (0.00000000 1.00000000) *
##            35) smoothness_worst>=-1.530302 27   7 B (0.25925926 0.74074074)  
##              70) texture_worst< 5.113166 6   0 M (1.00000000 0.00000000) *
##              71) texture_worst>=5.113166 21   1 B (0.04761905 0.95238095) *
##         9) texture_worst< 4.895983 207  85 M (0.58937198 0.41062802)  
##          18) smoothness_mean< -2.352368 82  10 M (0.87804878 0.12195122)  
##            36) texture_worst< 4.876647 76   4 M (0.94736842 0.05263158)  
##              72) texture_mean< 3.135225 71   0 M (1.00000000 0.00000000) *
##              73) texture_mean>=3.135225 5   1 B (0.20000000 0.80000000) *
##            37) texture_worst>=4.876647 6   0 B (0.00000000 1.00000000) *
##          19) smoothness_mean>=-2.352368 125  50 B (0.40000000 0.60000000)  
##            38) texture_worst< 4.608306 17   0 M (1.00000000 0.00000000) *
##            39) texture_worst>=4.608306 108  33 B (0.30555556 0.69444444)  
##              78) symmetry_worst>=-1.550826 39  14 M (0.64102564 0.35897436) *
##              79) symmetry_worst< -1.550826 69   8 B (0.11594203 0.88405797) *
##       5) smoothness_mean< -2.408446 118  43 B (0.36440678 0.63559322)  
##        10) smoothness_mean< -2.443746 85  39 B (0.45882353 0.54117647)  
##          20) smoothness_mean>=-2.489159 45  15 M (0.66666667 0.33333333)  
##            40) texture_mean>=2.991714 34   7 M (0.79411765 0.20588235)  
##              80) texture_worst< 5.316369 27   2 M (0.92592593 0.07407407) *
##              81) texture_worst>=5.316369 7   2 B (0.28571429 0.71428571) *
##            41) texture_mean< 2.991714 11   3 B (0.27272727 0.72727273)  
##              82) smoothness_mean>=-2.457256 3   0 M (1.00000000 0.00000000) *
##              83) smoothness_mean< -2.457256 8   0 B (0.00000000 1.00000000) *
##          21) smoothness_mean< -2.489159 40   9 B (0.22500000 0.77500000)  
##            42) texture_mean< 2.966301 5   0 M (1.00000000 0.00000000) *
##            43) texture_mean>=2.966301 35   4 B (0.11428571 0.88571429)  
##              86) symmetry_worst>=-1.695215 11   4 B (0.36363636 0.63636364) *
##              87) symmetry_worst< -1.695215 24   0 B (0.00000000 1.00000000) *
##        11) smoothness_mean>=-2.443746 33   4 B (0.12121212 0.87878788)  
##          22) texture_worst< 4.592462 3   0 M (1.00000000 0.00000000) *
##          23) texture_worst>=4.592462 30   1 B (0.03333333 0.96666667)  
##            46) texture_worst>=5.149193 4   1 B (0.25000000 0.75000000)  
##              92) texture_mean>=3.032025 1   0 M (1.00000000 0.00000000) *
##              93) texture_mean< 3.032025 3   0 B (0.00000000 1.00000000) *
##            47) texture_worst< 5.149193 26   0 B (0.00000000 1.00000000) *
##     3) texture_worst< 4.580648 426 153 B (0.35915493 0.64084507)  
##       6) smoothness_worst>=-1.496637 178  87 B (0.48876404 0.51123596)  
##        12) smoothness_worst< -1.476801 56  11 M (0.80357143 0.19642857)  
##          24) smoothness_mean>=-2.315133 51   6 M (0.88235294 0.11764706)  
##            48) smoothness_mean< -2.275944 38   0 M (1.00000000 0.00000000) *
##            49) smoothness_mean>=-2.275944 13   6 M (0.53846154 0.46153846)  
##              98) texture_mean>=3.003189 7   0 M (1.00000000 0.00000000) *
##              99) texture_mean< 3.003189 6   0 B (0.00000000 1.00000000) *
##          25) smoothness_mean< -2.315133 5   0 B (0.00000000 1.00000000) *
##        13) smoothness_worst>=-1.476801 122  42 B (0.34426230 0.65573770)  
##          26) smoothness_worst>=-1.451541 54  21 M (0.61111111 0.38888889)  
##            52) texture_mean>=2.798684 30   4 M (0.86666667 0.13333333)  
##             104) smoothness_mean< -2.155998 24   1 M (0.95833333 0.04166667) *
##             105) smoothness_mean>=-2.155998 6   3 M (0.50000000 0.50000000) *
##            53) texture_mean< 2.798684 24   7 B (0.29166667 0.70833333)  
##             106) symmetry_worst>=-1.232339 3   0 M (1.00000000 0.00000000) *
##             107) symmetry_worst< -1.232339 21   4 B (0.19047619 0.80952381) *
##          27) smoothness_worst< -1.451541 68   9 B (0.13235294 0.86764706)  
##            54) compactness_se>=-3.453499 12   4 M (0.66666667 0.33333333)  
##             108) texture_mean< 2.801271 8   0 M (1.00000000 0.00000000) *
##             109) texture_mean>=2.801271 4   0 B (0.00000000 1.00000000) *
##            55) compactness_se< -3.453499 56   1 B (0.01785714 0.98214286)  
##             110) texture_mean>=2.932513 1   0 M (1.00000000 0.00000000) *
##             111) texture_mean< 2.932513 55   0 B (0.00000000 1.00000000) *
##       7) smoothness_worst< -1.496637 248  66 B (0.26612903 0.73387097)  
##        14) symmetry_worst>=-1.830253 146  56 B (0.38356164 0.61643836)  
##          28) symmetry_worst< -1.598517 113  55 B (0.48672566 0.51327434)  
##            56) compactness_se< -3.93685 48  12 M (0.75000000 0.25000000)  
##             112) smoothness_mean< -2.320044 43   7 M (0.83720930 0.16279070) *
##             113) smoothness_mean>=-2.320044 5   0 B (0.00000000 1.00000000) *
##            57) compactness_se>=-3.93685 65  19 B (0.29230769 0.70769231)  
##             114) smoothness_mean>=-2.384488 23   7 M (0.69565217 0.30434783) *
##             115) smoothness_mean< -2.384488 42   3 B (0.07142857 0.92857143) *
##          29) symmetry_worst>=-1.598517 33   1 B (0.03030303 0.96969697)  
##            58) texture_mean>=2.990556 1   0 M (1.00000000 0.00000000) *
##            59) texture_mean< 2.990556 32   0 B (0.00000000 1.00000000) *
##        15) symmetry_worst< -1.830253 102  10 B (0.09803922 0.90196078)  
##          30) symmetry_worst< -2.880164 4   0 M (1.00000000 0.00000000) *
##          31) symmetry_worst>=-2.880164 98   6 B (0.06122449 0.93877551)  
##            62) compactness_se>=-2.988951 2   0 M (1.00000000 0.00000000) *
##            63) compactness_se< -2.988951 96   4 B (0.04166667 0.95833333)  
##             126) compactness_se< -4.49319 11   4 B (0.36363636 0.63636364) *
##             127) compactness_se>=-4.49319 85   0 B (0.00000000 1.00000000) *
## 
## $trees[[56]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 429 B (0.47039474 0.52960526)  
##     2) smoothness_worst>=-1.504307 393 171 M (0.56488550 0.43511450)  
##       4) smoothness_worst< -1.476801 130  30 M (0.76923077 0.23076923)  
##         8) compactness_se>=-4.245893 107  14 M (0.86915888 0.13084112)  
##          16) smoothness_mean>=-2.367658 97   8 M (0.91752577 0.08247423)  
##            32) symmetry_worst< -1.561818 77   2 M (0.97402597 0.02597403)  
##              64) texture_mean>=2.647471 76   1 M (0.98684211 0.01315789) *
##              65) texture_mean< 2.647471 1   0 B (0.00000000 1.00000000) *
##            33) symmetry_worst>=-1.561818 20   6 M (0.70000000 0.30000000)  
##              66) smoothness_worst>=-1.491834 11   0 M (1.00000000 0.00000000) *
##              67) smoothness_worst< -1.491834 9   3 B (0.33333333 0.66666667) *
##          17) smoothness_mean< -2.367658 10   4 B (0.40000000 0.60000000)  
##            34) smoothness_mean< -2.408063 5   1 M (0.80000000 0.20000000)  
##              68) texture_mean>=2.903025 4   0 M (1.00000000 0.00000000) *
##              69) texture_mean< 2.903025 1   0 B (0.00000000 1.00000000) *
##            35) smoothness_mean>=-2.408063 5   0 B (0.00000000 1.00000000) *
##         9) compactness_se< -4.245893 23   7 B (0.30434783 0.69565217)  
##          18) symmetry_worst< -1.688448 9   2 M (0.77777778 0.22222222)  
##            36) texture_mean>=2.800736 7   0 M (1.00000000 0.00000000) *
##            37) texture_mean< 2.800736 2   0 B (0.00000000 1.00000000) *
##          19) symmetry_worst>=-1.688448 14   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst>=-1.476801 263 122 B (0.46387833 0.53612167)  
##        10) smoothness_mean>=-2.225626 67  17 M (0.74626866 0.25373134)  
##          20) symmetry_worst>=-1.766269 52   6 M (0.88461538 0.11538462)  
##            40) smoothness_worst< -1.396673 36   0 M (1.00000000 0.00000000) *
##            41) smoothness_worst>=-1.396673 16   6 M (0.62500000 0.37500000)  
##              82) symmetry_worst>=-1.616162 10   0 M (1.00000000 0.00000000) *
##              83) symmetry_worst< -1.616162 6   0 B (0.00000000 1.00000000) *
##          21) symmetry_worst< -1.766269 15   4 B (0.26666667 0.73333333)  
##            42) smoothness_worst< -1.465711 4   0 M (1.00000000 0.00000000) *
##            43) smoothness_worst>=-1.465711 11   0 B (0.00000000 1.00000000) *
##        11) smoothness_mean< -2.225626 196  72 B (0.36734694 0.63265306)  
##          22) texture_worst>=4.866447 58  19 M (0.67241379 0.32758621)  
##            44) compactness_se>=-4.512898 47   8 M (0.82978723 0.17021277)  
##              88) compactness_se< -2.942351 43   4 M (0.90697674 0.09302326) *
##              89) compactness_se>=-2.942351 4   0 B (0.00000000 1.00000000) *
##            45) compactness_se< -4.512898 11   0 B (0.00000000 1.00000000) *
##          23) texture_worst< 4.866447 138  33 B (0.23913043 0.76086957)  
##            46) symmetry_worst>=-1.864441 104  33 B (0.31730769 0.68269231)  
##              92) texture_worst< 4.786713 78  32 B (0.41025641 0.58974359) *
##              93) texture_worst>=4.786713 26   1 B (0.03846154 0.96153846) *
##            47) symmetry_worst< -1.864441 34   0 B (0.00000000 1.00000000) *
##     3) smoothness_worst< -1.504307 519 207 B (0.39884393 0.60115607)  
##       6) smoothness_worst< -1.533657 365 172 B (0.47123288 0.52876712)  
##        12) smoothness_worst>=-1.556752 113  42 M (0.62831858 0.37168142)  
##          24) compactness_se< -4.087687 51   8 M (0.84313725 0.15686275)  
##            48) smoothness_mean>=-2.469882 46   3 M (0.93478261 0.06521739)  
##              96) smoothness_worst< -1.539367 43   0 M (1.00000000 0.00000000) *
##              97) smoothness_worst>=-1.539367 3   0 B (0.00000000 1.00000000) *
##            49) smoothness_mean< -2.469882 5   0 B (0.00000000 1.00000000) *
##          25) compactness_se>=-4.087687 62  28 B (0.45161290 0.54838710)  
##            50) compactness_se>=-3.569872 30   4 M (0.86666667 0.13333333)  
##             100) symmetry_worst< -1.710027 23   0 M (1.00000000 0.00000000) *
##             101) symmetry_worst>=-1.710027 7   3 B (0.42857143 0.57142857) *
##            51) compactness_se< -3.569872 32   2 B (0.06250000 0.93750000)  
##             102) symmetry_worst>=-1.617937 3   1 M (0.66666667 0.33333333) *
##             103) symmetry_worst< -1.617937 29   0 B (0.00000000 1.00000000) *
##        13) smoothness_worst< -1.556752 252 101 B (0.40079365 0.59920635)  
##          26) smoothness_worst< -1.563512 219 100 B (0.45662100 0.54337900)  
##            52) texture_worst< 3.981173 11   0 M (1.00000000 0.00000000) *
##            53) texture_worst>=3.981173 208  89 B (0.42788462 0.57211538)  
##             106) texture_worst>=4.609399 111  49 M (0.55855856 0.44144144) *
##             107) texture_worst< 4.609399 97  27 B (0.27835052 0.72164948) *
##          27) smoothness_worst>=-1.563512 33   1 B (0.03030303 0.96969697)  
##            54) texture_mean>=3.274729 1   0 M (1.00000000 0.00000000) *
##            55) texture_mean< 3.274729 32   0 B (0.00000000 1.00000000) *
##       7) smoothness_worst>=-1.533657 154  35 B (0.22727273 0.77272727)  
##        14) smoothness_mean< -2.329341 69  31 B (0.44927536 0.55072464)  
##          28) smoothness_mean>=-2.369786 31   8 M (0.74193548 0.25806452)  
##            56) compactness_se>=-3.979062 27   4 M (0.85185185 0.14814815)  
##             112) compactness_se< -3.468609 23   0 M (1.00000000 0.00000000) *
##             113) compactness_se>=-3.468609 4   0 B (0.00000000 1.00000000) *
##            57) compactness_se< -3.979062 4   0 B (0.00000000 1.00000000) *
##          29) smoothness_mean< -2.369786 38   8 B (0.21052632 0.78947368)  
##            58) symmetry_worst>=-1.567876 8   3 M (0.62500000 0.37500000)  
##             116) smoothness_worst< -1.513087 5   0 M (1.00000000 0.00000000) *
##             117) smoothness_worst>=-1.513087 3   0 B (0.00000000 1.00000000) *
##            59) symmetry_worst< -1.567876 30   3 B (0.10000000 0.90000000)  
##             118) smoothness_mean< -2.438762 9   3 B (0.33333333 0.66666667) *
##             119) smoothness_mean>=-2.438762 21   0 B (0.00000000 1.00000000) *
##        15) smoothness_mean>=-2.329341 85   4 B (0.04705882 0.95294118)  
##          30) texture_mean>=3.019196 26   4 B (0.15384615 0.84615385)  
##            60) smoothness_mean>=-2.257258 3   0 M (1.00000000 0.00000000) *
##            61) smoothness_mean< -2.257258 23   1 B (0.04347826 0.95652174)  
##             122) texture_mean< 3.028188 1   0 M (1.00000000 0.00000000) *
##             123) texture_mean>=3.028188 22   0 B (0.00000000 1.00000000) *
##          31) texture_mean< 3.019196 59   0 B (0.00000000 1.00000000) *
## 
## $trees[[57]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 912 448 B (0.4912281 0.5087719)  
##    2) smoothness_mean>=-2.546123 876 432 M (0.5068493 0.4931507)  
##      4) compactness_se>=-4.704842 862 418 M (0.5150812 0.4849188)  
##        8) compactness_se< -4.098353 230  81 M (0.6478261 0.3521739)  
##         16) smoothness_mean< -2.289681 203  58 M (0.7142857 0.2857143)  
##           32) texture_worst>=4.362076 183  42 M (0.7704918 0.2295082)  
##             64) symmetry_worst< -1.484766 169  31 M (0.8165680 0.1834320) *
##             65) symmetry_worst>=-1.484766 14   3 B (0.2142857 0.7857143) *
##           33) texture_worst< 4.362076 20   4 B (0.2000000 0.8000000)  
##             66) compactness_se>=-4.193609 6   2 M (0.6666667 0.3333333) *
##             67) compactness_se< -4.193609 14   0 B (0.0000000 1.0000000) *
##         17) smoothness_mean>=-2.289681 27   4 B (0.1481481 0.8518519)  
##           34) smoothness_mean>=-2.222419 7   3 M (0.5714286 0.4285714)  
##             68) texture_mean>=2.892399 4   0 M (1.0000000 0.0000000) *
##             69) texture_mean< 2.892399 3   0 B (0.0000000 1.0000000) *
##           35) smoothness_mean< -2.222419 20   0 B (0.0000000 1.0000000) *
##        9) compactness_se>=-4.098353 632 295 B (0.4667722 0.5332278)  
##         18) compactness_se>=-4.05446 599 295 B (0.4924875 0.5075125)  
##           36) smoothness_worst>=-1.499656 287 110 M (0.6167247 0.3832753)  
##             72) smoothness_worst< -1.434076 203  63 M (0.6896552 0.3103448) *
##             73) smoothness_worst>=-1.434076 84  37 B (0.4404762 0.5595238) *
##           37) smoothness_worst< -1.499656 312 118 B (0.3782051 0.6217949)  
##             74) texture_worst>=4.569119 187  90 B (0.4812834 0.5187166) *
##             75) texture_worst< 4.569119 125  28 B (0.2240000 0.7760000) *
##         19) compactness_se< -4.05446 33   0 B (0.0000000 1.0000000) *
##      5) compactness_se< -4.704842 14   0 B (0.0000000 1.0000000) *
##    3) smoothness_mean< -2.546123 36   4 B (0.1111111 0.8888889)  
##      6) smoothness_worst< -1.720903 6   2 M (0.6666667 0.3333333)  
##       12) texture_mean< 3.103494 4   0 M (1.0000000 0.0000000) *
##       13) texture_mean>=3.103494 2   0 B (0.0000000 1.0000000) *
##      7) smoothness_worst>=-1.720903 30   0 B (0.0000000 1.0000000) *
## 
## $trees[[58]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 443 B (0.48574561 0.51425439)  
##     2) symmetry_worst>=-1.068249 22   0 M (1.00000000 0.00000000) *
##     3) symmetry_worst< -1.068249 890 421 B (0.47303371 0.52696629)  
##       6) texture_mean>=2.927988 554 257 M (0.53610108 0.46389892)  
##        12) smoothness_worst>=-1.473476 121  33 M (0.72727273 0.27272727)  
##          24) texture_worst< 4.76475 42   2 M (0.95238095 0.04761905)  
##            48) texture_mean>=2.934384 41   1 M (0.97560976 0.02439024)  
##              96) smoothness_mean< -2.107265 40   0 M (1.00000000 0.00000000) *
##              97) smoothness_mean>=-2.107265 1   0 B (0.00000000 1.00000000) *
##            49) texture_mean< 2.934384 1   0 B (0.00000000 1.00000000) *
##          25) texture_worst>=4.76475 79  31 M (0.60759494 0.39240506)  
##            50) texture_worst>=4.821213 69  21 M (0.69565217 0.30434783)  
##             100) compactness_se< -3.379822 56  11 M (0.80357143 0.19642857) *
##             101) compactness_se>=-3.379822 13   3 B (0.23076923 0.76923077) *
##            51) texture_worst< 4.821213 10   0 B (0.00000000 1.00000000) *
##        13) smoothness_worst< -1.473476 433 209 B (0.48267898 0.51732102)  
##          26) smoothness_mean< -2.353249 289 125 M (0.56747405 0.43252595)  
##            52) smoothness_mean>=-2.408446 98  23 M (0.76530612 0.23469388)  
##             104) texture_worst< 4.876647 79  10 M (0.87341772 0.12658228) *
##             105) texture_worst>=4.876647 19   6 B (0.31578947 0.68421053) *
##            53) smoothness_mean< -2.408446 191  89 B (0.46596859 0.53403141)  
##             106) texture_worst< 4.592462 53   9 M (0.83018868 0.16981132) *
##             107) texture_worst>=4.592462 138  45 B (0.32608696 0.67391304) *
##          27) smoothness_mean>=-2.353249 144  45 B (0.31250000 0.68750000)  
##            54) smoothness_mean>=-2.303285 82  38 B (0.46341463 0.53658537)  
##             108) compactness_se>=-3.470794 24   5 M (0.79166667 0.20833333) *
##             109) compactness_se< -3.470794 58  19 B (0.32758621 0.67241379) *
##            55) smoothness_mean< -2.303285 62   7 B (0.11290323 0.88709677)  
##             110) texture_worst>=4.764475 14   7 M (0.50000000 0.50000000) *
##             111) texture_worst< 4.764475 48   0 B (0.00000000 1.00000000) *
##       7) texture_mean< 2.927988 336 124 B (0.36904762 0.63095238)  
##        14) symmetry_worst< -1.809006 150  70 M (0.53333333 0.46666667)  
##          28) texture_worst< 4.400796 117  40 M (0.65811966 0.34188034)  
##            56) smoothness_mean< -2.290163 98  21 M (0.78571429 0.21428571)  
##             112) texture_mean< 2.903056 90  13 M (0.85555556 0.14444444) *
##             113) texture_mean>=2.903056 8   0 B (0.00000000 1.00000000) *
##            57) smoothness_mean>=-2.290163 19   0 B (0.00000000 1.00000000) *
##          29) texture_worst>=4.400796 33   3 B (0.09090909 0.90909091)  
##            58) compactness_se< -4.431402 8   3 B (0.37500000 0.62500000)  
##             116) compactness_se>=-4.50262 3   0 M (1.00000000 0.00000000) *
##             117) compactness_se< -4.50262 5   0 B (0.00000000 1.00000000) *
##            59) compactness_se>=-4.431402 25   0 B (0.00000000 1.00000000) *
##        15) symmetry_worst>=-1.809006 186  44 B (0.23655914 0.76344086)  
##          30) smoothness_worst>=-1.520499 118  39 B (0.33050847 0.66949153)  
##            60) smoothness_worst< -1.517609 8   0 M (1.00000000 0.00000000) *
##            61) smoothness_worst>=-1.517609 110  31 B (0.28181818 0.71818182)  
##             122) symmetry_worst>=-1.641484 76  30 B (0.39473684 0.60526316) *
##             123) symmetry_worst< -1.641484 34   1 B (0.02941176 0.97058824) *
##          31) smoothness_worst< -1.520499 68   5 B (0.07352941 0.92647059)  
##            62) compactness_se< -4.159844 19   5 B (0.26315789 0.73684211)  
##             124) compactness_se>=-4.166611 4   0 M (1.00000000 0.00000000) *
##             125) compactness_se< -4.166611 15   1 B (0.06666667 0.93333333) *
##            63) compactness_se>=-4.159844 49   0 B (0.00000000 1.00000000) *
## 
## $trees[[59]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 430 M (0.52850877 0.47149123)  
##     2) symmetry_worst>=-1.785734 501 194 M (0.61277445 0.38722555)  
##       4) texture_mean>=2.806989 443 152 M (0.65688488 0.34311512)  
##         8) compactness_se>=-3.955455 259  58 M (0.77606178 0.22393822)  
##          16) compactness_se< -2.86687 242  47 M (0.80578512 0.19421488)  
##            32) symmetry_worst< -1.128751 230  39 M (0.83043478 0.16956522)  
##              64) texture_mean< 3.36829 227  36 M (0.84140969 0.15859031) *
##              65) texture_mean>=3.36829 3   0 B (0.00000000 1.00000000) *
##            33) symmetry_worst>=-1.128751 12   4 B (0.33333333 0.66666667)  
##              66) smoothness_worst>=-1.49848 4   0 M (1.00000000 0.00000000) *
##              67) smoothness_worst< -1.49848 8   0 B (0.00000000 1.00000000) *
##          17) compactness_se>=-2.86687 17   6 B (0.35294118 0.64705882)  
##            34) texture_mean>=3.050024 5   0 M (1.00000000 0.00000000) *
##            35) texture_mean< 3.050024 12   1 B (0.08333333 0.91666667)  
##              70) smoothness_mean>=-2.161865 1   0 M (1.00000000 0.00000000) *
##              71) smoothness_mean< -2.161865 11   0 B (0.00000000 1.00000000) *
##         9) compactness_se< -3.955455 184  90 B (0.48913043 0.51086957)  
##          18) symmetry_worst< -1.733593 18   0 M (1.00000000 0.00000000) *
##          19) symmetry_worst>=-1.733593 166  72 B (0.43373494 0.56626506)  
##            38) smoothness_worst< -1.607486 30   4 M (0.86666667 0.13333333)  
##              76) texture_mean< 3.296262 26   0 M (1.00000000 0.00000000) *
##              77) texture_mean>=3.296262 4   0 B (0.00000000 1.00000000) *
##            39) smoothness_worst>=-1.607486 136  46 B (0.33823529 0.66176471)  
##              78) symmetry_worst>=-1.577444 81  39 B (0.48148148 0.51851852) *
##              79) symmetry_worst< -1.577444 55   7 B (0.12727273 0.87272727) *
##       5) texture_mean< 2.806989 58  16 B (0.27586207 0.72413793)  
##        10) smoothness_mean>=-2.232593 15   3 M (0.80000000 0.20000000)  
##          20) symmetry_worst< -1.492909 10   0 M (1.00000000 0.00000000) *
##          21) symmetry_worst>=-1.492909 5   2 B (0.40000000 0.60000000)  
##            42) smoothness_mean< -2.229802 2   0 M (1.00000000 0.00000000) *
##            43) smoothness_mean>=-2.229802 3   0 B (0.00000000 1.00000000) *
##        11) smoothness_mean< -2.232593 43   4 B (0.09302326 0.90697674)  
##          22) compactness_se>=-3.433945 8   4 M (0.50000000 0.50000000)  
##            44) smoothness_worst>=-1.49622 4   0 M (1.00000000 0.00000000) *
##            45) smoothness_worst< -1.49622 4   0 B (0.00000000 1.00000000) *
##          23) compactness_se< -3.433945 35   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.785734 411 175 B (0.42579075 0.57420925)  
##       6) smoothness_worst>=-1.603315 345 164 B (0.47536232 0.52463768)  
##        12) smoothness_worst< -1.59596 28   1 M (0.96428571 0.03571429)  
##          24) texture_mean>=2.85796 18   0 M (1.00000000 0.00000000) *
##          25) texture_mean< 2.85796 10   1 M (0.90000000 0.10000000)  
##            50) texture_mean< 2.793316 9   0 M (1.00000000 0.00000000) *
##            51) texture_mean>=2.793316 1   0 B (0.00000000 1.00000000) *
##        13) smoothness_worst>=-1.59596 317 137 B (0.43217666 0.56782334)  
##          26) texture_mean>=3.043808 126  47 M (0.62698413 0.37301587)  
##            52) texture_mean< 3.21023 72   8 M (0.88888889 0.11111111)  
##             104) smoothness_worst>=-1.583806 68   4 M (0.94117647 0.05882353) *
##             105) smoothness_worst< -1.583806 4   0 B (0.00000000 1.00000000) *
##            53) texture_mean>=3.21023 54  15 B (0.27777778 0.72222222)  
##             106) texture_worst>=5.073596 29  14 M (0.51724138 0.48275862) *
##             107) texture_worst< 5.073596 25   0 B (0.00000000 1.00000000) *
##          27) texture_mean< 3.043808 191  58 B (0.30366492 0.69633508)  
##            54) smoothness_worst>=-1.500665 78  35 B (0.44871795 0.55128205)  
##             108) smoothness_worst< -1.476411 35  10 M (0.71428571 0.28571429) *
##             109) smoothness_worst>=-1.476411 43  10 B (0.23255814 0.76744186) *
##            55) smoothness_worst< -1.500665 113  23 B (0.20353982 0.79646018)  
##             110) smoothness_worst< -1.594361 5   0 M (1.00000000 0.00000000) *
##             111) smoothness_worst>=-1.594361 108  18 B (0.16666667 0.83333333) *
##       7) smoothness_worst< -1.603315 66  11 B (0.16666667 0.83333333)  
##        14) compactness_se< -4.6643 15   6 M (0.60000000 0.40000000)  
##          28) smoothness_mean< -2.535018 9   0 M (1.00000000 0.00000000) *
##          29) smoothness_mean>=-2.535018 6   0 B (0.00000000 1.00000000) *
##        15) compactness_se>=-4.6643 51   2 B (0.03921569 0.96078431)  
##          30) smoothness_worst< -1.720903 3   1 M (0.66666667 0.33333333)  
##            60) texture_mean>=3.026052 2   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 3.026052 1   0 B (0.00000000 1.00000000) *
##          31) smoothness_worst>=-1.720903 48   0 B (0.00000000 1.00000000) *
## 
## $trees[[60]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 453 B (0.49671053 0.50328947)  
##     2) symmetry_worst>=-1.424186 86  25 M (0.70930233 0.29069767)  
##       4) texture_worst>=4.647183 42   4 M (0.90476190 0.09523810)  
##         8) smoothness_worst>=-1.49649 36   0 M (1.00000000 0.00000000) *
##         9) smoothness_worst< -1.49649 6   2 B (0.33333333 0.66666667)  
##          18) smoothness_mean< -2.311841 2   0 M (1.00000000 0.00000000) *
##          19) smoothness_mean>=-2.311841 4   0 B (0.00000000 1.00000000) *
##       5) texture_worst< 4.647183 44  21 M (0.52272727 0.47727273)  
##        10) smoothness_mean>=-2.217831 15   2 M (0.86666667 0.13333333)  
##          20) smoothness_mean< -2.022167 13   0 M (1.00000000 0.00000000) *
##          21) smoothness_mean>=-2.022167 2   0 B (0.00000000 1.00000000) *
##        11) smoothness_mean< -2.217831 29  10 B (0.34482759 0.65517241)  
##          22) texture_worst< 4.136225 10   0 M (1.00000000 0.00000000) *
##          23) texture_worst>=4.136225 19   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.424186 826 392 B (0.47457627 0.52542373)  
##       6) texture_worst>=4.260219 718 356 M (0.50417827 0.49582173)  
##        12) compactness_se< -4.116284 223  83 M (0.62780269 0.37219731)  
##          24) symmetry_worst< -1.508268 209  70 M (0.66507177 0.33492823)  
##            48) compactness_se>=-4.705732 200  61 M (0.69500000 0.30500000)  
##              96) texture_worst>=4.339889 194  55 M (0.71649485 0.28350515) *
##              97) texture_worst< 4.339889 6   0 B (0.00000000 1.00000000) *
##            49) compactness_se< -4.705732 9   0 B (0.00000000 1.00000000) *
##          25) symmetry_worst>=-1.508268 14   1 B (0.07142857 0.92857143)  
##            50) compactness_se>=-4.234991 1   0 M (1.00000000 0.00000000) *
##            51) compactness_se< -4.234991 13   0 B (0.00000000 1.00000000) *
##        13) compactness_se>=-4.116284 495 222 B (0.44848485 0.55151515)  
##          26) texture_mean< 2.747587 23   2 M (0.91304348 0.08695652)  
##            52) texture_mean>=2.697516 21   0 M (1.00000000 0.00000000) *
##            53) texture_mean< 2.697516 2   0 B (0.00000000 1.00000000) *
##          27) texture_mean>=2.747587 472 201 B (0.42584746 0.57415254)  
##            54) texture_mean>=2.927988 378 182 B (0.48148148 0.51851852)  
##             108) texture_mean< 2.940483 29   1 M (0.96551724 0.03448276) *
##             109) texture_mean>=2.940483 349 154 B (0.44126074 0.55873926) *
##            55) texture_mean< 2.927988 94  19 B (0.20212766 0.79787234)  
##             110) smoothness_worst>=-1.45348 13   4 M (0.69230769 0.30769231) *
##             111) smoothness_worst< -1.45348 81  10 B (0.12345679 0.87654321) *
##       7) texture_worst< 4.260219 108  30 B (0.27777778 0.72222222)  
##        14) compactness_se>=-3.894783 70  30 B (0.42857143 0.57142857)  
##          28) compactness_se< -3.48221 51  21 M (0.58823529 0.41176471)  
##            56) texture_worst< 4.206328 37   7 M (0.81081081 0.18918919)  
##             112) compactness_se>=-3.764682 22   0 M (1.00000000 0.00000000) *
##             113) compactness_se< -3.764682 15   7 M (0.53333333 0.46666667) *
##            57) texture_worst>=4.206328 14   0 B (0.00000000 1.00000000) *
##          29) compactness_se>=-3.48221 19   0 B (0.00000000 1.00000000) *
##        15) compactness_se< -3.894783 38   0 B (0.00000000 1.00000000) *
## 
## $trees[[61]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 392 M (0.57017544 0.42982456)  
##     2) smoothness_worst>=-1.525694 520 182 M (0.65000000 0.35000000)  
##       4) texture_mean>=3.06081 166  31 M (0.81325301 0.18674699)  
##         8) smoothness_mean< -2.301586 103  10 M (0.90291262 0.09708738)  
##          16) compactness_se< -3.106177 97   4 M (0.95876289 0.04123711)  
##            32) compactness_se>=-4.507761 95   2 M (0.97894737 0.02105263)  
##              64) texture_mean< 3.355261 86   0 M (1.00000000 0.00000000) *
##              65) texture_mean>=3.355261 9   2 M (0.77777778 0.22222222) *
##            33) compactness_se< -4.507761 2   0 B (0.00000000 1.00000000) *
##          17) compactness_se>=-3.106177 6   0 B (0.00000000 1.00000000) *
##         9) smoothness_mean>=-2.301586 63  21 M (0.66666667 0.33333333)  
##          18) smoothness_mean>=-2.257137 28   3 M (0.89285714 0.10714286)  
##            36) smoothness_mean< -2.105484 25   0 M (1.00000000 0.00000000) *
##            37) smoothness_mean>=-2.105484 3   0 B (0.00000000 1.00000000) *
##          19) smoothness_mean< -2.257137 35  17 B (0.48571429 0.51428571)  
##            38) texture_worst>=4.965981 9   0 M (1.00000000 0.00000000) *
##            39) texture_worst< 4.965981 26   8 B (0.30769231 0.69230769)  
##              78) compactness_se>=-3.444069 6   0 M (1.00000000 0.00000000) *
##              79) compactness_se< -3.444069 20   2 B (0.10000000 0.90000000) *
##       5) texture_mean< 3.06081 354 151 M (0.57344633 0.42655367)  
##        10) smoothness_worst>=-1.496637 287 106 M (0.63066202 0.36933798)  
##          20) texture_mean< 2.987952 209  60 M (0.71291866 0.28708134)  
##            40) symmetry_worst>=-1.864441 166  34 M (0.79518072 0.20481928)  
##              80) texture_worst>=4.194566 145  21 M (0.85517241 0.14482759) *
##              81) texture_worst< 4.194566 21   8 B (0.38095238 0.61904762) *
##            41) symmetry_worst< -1.864441 43  17 B (0.39534884 0.60465116)  
##              82) smoothness_worst< -1.480334 18   3 M (0.83333333 0.16666667) *
##              83) smoothness_worst>=-1.480334 25   2 B (0.08000000 0.92000000) *
##          21) texture_mean>=2.987952 78  32 B (0.41025641 0.58974359)  
##            42) texture_worst>=4.874946 19   4 M (0.78947368 0.21052632)  
##              84) compactness_se>=-4.030876 14   0 M (1.00000000 0.00000000) *
##              85) compactness_se< -4.030876 5   1 B (0.20000000 0.80000000) *
##            43) texture_worst< 4.874946 59  17 B (0.28813559 0.71186441)  
##              86) compactness_se< -4.280193 7   1 M (0.85714286 0.14285714) *
##              87) compactness_se>=-4.280193 52  11 B (0.21153846 0.78846154) *
##        11) smoothness_worst< -1.496637 67  22 B (0.32835821 0.67164179)  
##          22) symmetry_worst< -1.736492 39  18 M (0.53846154 0.46153846)  
##            44) compactness_se< -3.210824 27   6 M (0.77777778 0.22222222)  
##              88) compactness_se>=-3.979062 21   1 M (0.95238095 0.04761905) *
##              89) compactness_se< -3.979062 6   1 B (0.16666667 0.83333333) *
##            45) compactness_se>=-3.210824 12   0 B (0.00000000 1.00000000) *
##          23) symmetry_worst>=-1.736492 28   1 B (0.03571429 0.96428571)  
##            46) texture_mean>=3.01402 1   0 M (1.00000000 0.00000000) *
##            47) texture_mean< 3.01402 27   0 B (0.00000000 1.00000000) *
##     3) smoothness_worst< -1.525694 392 182 B (0.46428571 0.53571429)  
##       6) compactness_se>=-3.610301 135  51 M (0.62222222 0.37777778)  
##        12) smoothness_mean>=-2.396197 68  14 M (0.79411765 0.20588235)  
##          24) smoothness_worst< -1.527573 64  10 M (0.84375000 0.15625000)  
##            48) texture_worst>=4.411908 54   4 M (0.92592593 0.07407407)  
##              96) smoothness_worst>=-1.618016 49   1 M (0.97959184 0.02040816) *
##              97) smoothness_worst< -1.618016 5   2 B (0.40000000 0.60000000) *
##            49) texture_worst< 4.411908 10   4 B (0.40000000 0.60000000)  
##              98) compactness_se< -3.492332 4   0 M (1.00000000 0.00000000) *
##              99) compactness_se>=-3.492332 6   0 B (0.00000000 1.00000000) *
##          25) smoothness_worst>=-1.527573 4   0 B (0.00000000 1.00000000) *
##        13) smoothness_mean< -2.396197 67  30 B (0.44776119 0.55223881)  
##          26) compactness_se< -3.593774 15   0 M (1.00000000 0.00000000) *
##          27) compactness_se>=-3.593774 52  15 B (0.28846154 0.71153846)  
##            54) smoothness_worst>=-1.604936 29  14 B (0.48275862 0.51724138)  
##             108) smoothness_worst< -1.594363 12   0 M (1.00000000 0.00000000) *
##             109) smoothness_worst>=-1.594363 17   2 B (0.11764706 0.88235294) *
##            55) smoothness_worst< -1.604936 23   1 B (0.04347826 0.95652174)  
##             110) smoothness_worst< -1.720903 1   0 M (1.00000000 0.00000000) *
##             111) smoothness_worst>=-1.720903 22   0 B (0.00000000 1.00000000) *
##       7) compactness_se< -3.610301 257  98 B (0.38132296 0.61867704)  
##        14) compactness_se< -4.579712 47  14 M (0.70212766 0.29787234)  
##          28) compactness_se>=-4.711555 36   4 M (0.88888889 0.11111111)  
##            56) smoothness_worst< -1.549205 33   1 M (0.96969697 0.03030303)  
##             112) smoothness_worst>=-1.609211 25   0 M (1.00000000 0.00000000) *
##             113) smoothness_worst< -1.609211 8   1 M (0.87500000 0.12500000) *
##            57) smoothness_worst>=-1.549205 3   0 B (0.00000000 1.00000000) *
##          29) compactness_se< -4.711555 11   1 B (0.09090909 0.90909091)  
##            58) symmetry_worst>=-1.179946 1   0 M (1.00000000 0.00000000) *
##            59) symmetry_worst< -1.179946 10   0 B (0.00000000 1.00000000) *
##        15) compactness_se>=-4.579712 210  65 B (0.30952381 0.69047619)  
##          30) smoothness_mean< -2.382983 153  61 B (0.39869281 0.60130719)  
##            60) smoothness_mean>=-2.434747 69  25 M (0.63768116 0.36231884)  
##             120) smoothness_worst< -1.538735 59  15 M (0.74576271 0.25423729) *
##             121) smoothness_worst>=-1.538735 10   0 B (0.00000000 1.00000000) *
##            61) smoothness_mean< -2.434747 84  17 B (0.20238095 0.79761905)  
##             122) texture_worst< 4.447343 21  10 M (0.52380952 0.47619048) *
##             123) texture_worst>=4.447343 63   6 B (0.09523810 0.90476190) *
##          31) smoothness_mean>=-2.382983 57   4 B (0.07017544 0.92982456)  
##            62) compactness_se>=-3.721403 10   4 B (0.40000000 0.60000000)  
##             124) compactness_se< -3.62066 4   0 M (1.00000000 0.00000000) *
##             125) compactness_se>=-3.62066 6   0 B (0.00000000 1.00000000) *
##            63) compactness_se< -3.721403 47   0 B (0.00000000 1.00000000) *
## 
## $trees[[62]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 437 M (0.52083333 0.47916667)  
##     2) texture_mean>=2.963467 519 198 M (0.61849711 0.38150289)  
##       4) symmetry_worst>=-2.01934 408 134 M (0.67156863 0.32843137)  
##         8) smoothness_worst>=-1.660611 399 125 M (0.68671679 0.31328321)  
##          16) texture_worst>=5.06141 82  12 M (0.85365854 0.14634146)  
##            32) symmetry_worst< -1.450078 73   6 M (0.91780822 0.08219178)  
##              64) texture_worst< 5.386175 56   0 M (1.00000000 0.00000000) *
##              65) texture_worst>=5.386175 17   6 M (0.64705882 0.35294118) *
##            33) symmetry_worst>=-1.450078 9   3 B (0.33333333 0.66666667)  
##              66) texture_mean< 3.217018 3   0 M (1.00000000 0.00000000) *
##              67) texture_mean>=3.217018 6   0 B (0.00000000 1.00000000) *
##          17) texture_worst< 5.06141 317 113 M (0.64353312 0.35646688)  
##            34) smoothness_mean>=-2.42138 234  68 M (0.70940171 0.29059829)  
##              68) smoothness_mean< -2.352368 64   7 M (0.89062500 0.10937500) *
##              69) smoothness_mean>=-2.352368 170  61 M (0.64117647 0.35882353) *
##            35) smoothness_mean< -2.42138 83  38 B (0.45783133 0.54216867)  
##              70) compactness_se< -4.014684 46  13 M (0.71739130 0.28260870) *
##              71) compactness_se>=-4.014684 37   5 B (0.13513514 0.86486486) *
##         9) smoothness_worst< -1.660611 9   0 B (0.00000000 1.00000000) *
##       5) symmetry_worst< -2.01934 111  47 B (0.42342342 0.57657658)  
##        10) compactness_se>=-3.551587 48  15 M (0.68750000 0.31250000)  
##          20) smoothness_worst>=-1.601489 36   6 M (0.83333333 0.16666667)  
##            40) texture_worst< 5.216315 32   2 M (0.93750000 0.06250000)  
##              80) texture_mean>=3.049609 30   0 M (1.00000000 0.00000000) *
##              81) texture_mean< 3.049609 2   0 B (0.00000000 1.00000000) *
##            41) texture_worst>=5.216315 4   0 B (0.00000000 1.00000000) *
##          21) smoothness_worst< -1.601489 12   3 B (0.25000000 0.75000000)  
##            42) texture_worst< 4.59024 4   1 M (0.75000000 0.25000000)  
##              84) texture_mean>=3.032942 3   0 M (1.00000000 0.00000000) *
##              85) texture_mean< 3.032942 1   0 B (0.00000000 1.00000000) *
##            43) texture_worst>=4.59024 8   0 B (0.00000000 1.00000000) *
##        11) compactness_se< -3.551587 63  14 B (0.22222222 0.77777778)  
##          22) texture_mean< 3.038878 8   1 M (0.87500000 0.12500000)  
##            44) texture_mean>=3.000441 7   0 M (1.00000000 0.00000000) *
##            45) texture_mean< 3.000441 1   0 B (0.00000000 1.00000000) *
##          23) texture_mean>=3.038878 55   7 B (0.12727273 0.87272727)  
##            46) smoothness_mean>=-2.290398 1   0 M (1.00000000 0.00000000) *
##            47) smoothness_mean< -2.290398 54   6 B (0.11111111 0.88888889)  
##              94) symmetry_worst>=-2.052205 17   5 B (0.29411765 0.70588235) *
##              95) symmetry_worst< -2.052205 37   1 B (0.02702703 0.97297297) *
##     3) texture_mean< 2.963467 393 154 B (0.39185751 0.60814249)  
##       6) smoothness_worst>=-1.451541 71  15 M (0.78873239 0.21126761)  
##        12) smoothness_mean< -2.240129 38   0 M (1.00000000 0.00000000) *
##        13) smoothness_mean>=-2.240129 33  15 M (0.54545455 0.45454545)  
##          26) texture_mean< 2.757784 17   3 M (0.82352941 0.17647059)  
##            52) smoothness_mean< -1.889548 16   2 M (0.87500000 0.12500000)  
##             104) compactness_se>=-3.896708 15   1 M (0.93333333 0.06666667) *
##             105) compactness_se< -3.896708 1   0 B (0.00000000 1.00000000) *
##            53) smoothness_mean>=-1.889548 1   0 B (0.00000000 1.00000000) *
##          27) texture_mean>=2.757784 16   4 B (0.25000000 0.75000000)  
##            54) texture_mean>=2.935178 3   0 M (1.00000000 0.00000000) *
##            55) texture_mean< 2.935178 13   1 B (0.07692308 0.92307692)  
##             110) smoothness_worst< -1.437625 1   0 M (1.00000000 0.00000000) *
##             111) smoothness_worst>=-1.437625 12   0 B (0.00000000 1.00000000) *
##       7) smoothness_worst< -1.451541 322  98 B (0.30434783 0.69565217)  
##        14) symmetry_worst>=-1.327359 23   6 M (0.73913043 0.26086957)  
##          28) symmetry_worst< -1.23578 13   0 M (1.00000000 0.00000000) *
##          29) symmetry_worst>=-1.23578 10   4 B (0.40000000 0.60000000)  
##            58) smoothness_worst< -1.461111 4   0 M (1.00000000 0.00000000) *
##            59) smoothness_worst>=-1.461111 6   0 B (0.00000000 1.00000000) *
##        15) symmetry_worst< -1.327359 299  81 B (0.27090301 0.72909699)  
##          30) smoothness_mean< -2.441446 77  38 M (0.50649351 0.49350649)  
##            60) symmetry_worst>=-1.816978 43   9 M (0.79069767 0.20930233)  
##             120) texture_worst< 4.644924 38   4 M (0.89473684 0.10526316) *
##             121) texture_worst>=4.644924 5   0 B (0.00000000 1.00000000) *
##            61) symmetry_worst< -1.816978 34   5 B (0.14705882 0.85294118)  
##             122) smoothness_mean>=-2.443746 5   0 M (1.00000000 0.00000000) *
##             123) smoothness_mean< -2.443746 29   0 B (0.00000000 1.00000000) *
##          31) smoothness_mean>=-2.441446 222  42 B (0.18918919 0.81081081)  
##            62) compactness_se< -4.116284 58  20 B (0.34482759 0.65517241)  
##             124) compactness_se>=-4.201715 17   2 M (0.88235294 0.11764706) *
##             125) compactness_se< -4.201715 41   5 B (0.12195122 0.87804878) *
##            63) compactness_se>=-4.116284 164  22 B (0.13414634 0.86585366)  
##             126) smoothness_worst>=-1.472307 12   5 B (0.41666667 0.58333333) *
##             127) smoothness_worst< -1.472307 152  17 B (0.11184211 0.88815789) *
## 
## $trees[[63]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 442 M (0.51535088 0.48464912)  
##     2) texture_mean>=3.065024 297 112 M (0.62289562 0.37710438)  
##       4) smoothness_worst>=-1.618721 268  86 M (0.67910448 0.32089552)  
##         8) texture_worst< 4.781945 55   1 M (0.98181818 0.01818182)  
##          16) texture_worst>=4.52814 54   0 M (1.00000000 0.00000000) *
##          17) texture_worst< 4.52814 1   0 B (0.00000000 1.00000000) *
##         9) texture_worst>=4.781945 213  85 M (0.60093897 0.39906103)  
##          18) texture_worst>=4.820212 185  63 M (0.65945946 0.34054054)  
##            36) symmetry_worst>=-1.71268 74  13 M (0.82432432 0.17567568)  
##              72) smoothness_mean>=-2.509617 71  10 M (0.85915493 0.14084507) *
##              73) smoothness_mean< -2.509617 3   0 B (0.00000000 1.00000000) *
##            37) symmetry_worst< -1.71268 111  50 M (0.54954955 0.45045045)  
##              74) symmetry_worst< -1.733593 91  30 M (0.67032967 0.32967033) *
##              75) symmetry_worst>=-1.733593 20   0 B (0.00000000 1.00000000) *
##          19) texture_worst< 4.820212 28   6 B (0.21428571 0.78571429)  
##            38) compactness_se>=-3.052779 3   0 M (1.00000000 0.00000000) *
##            39) compactness_se< -3.052779 25   3 B (0.12000000 0.88000000)  
##              78) smoothness_worst>=-1.444063 3   0 M (1.00000000 0.00000000) *
##              79) smoothness_worst< -1.444063 22   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst< -1.618721 29   3 B (0.10344828 0.89655172)  
##        10) texture_worst< 4.609308 4   1 M (0.75000000 0.25000000)  
##          20) texture_mean>=3.075433 3   0 M (1.00000000 0.00000000) *
##          21) texture_mean< 3.075433 1   0 B (0.00000000 1.00000000) *
##        11) texture_worst>=4.609308 25   0 B (0.00000000 1.00000000) *
##     3) texture_mean< 3.065024 615 285 B (0.46341463 0.53658537)  
##       6) smoothness_worst>=-1.472307 174  62 M (0.64367816 0.35632184)  
##        12) compactness_se>=-4.032549 123  32 M (0.73983740 0.26016260)  
##          24) compactness_se< -3.931945 30   0 M (1.00000000 0.00000000) *
##          25) compactness_se>=-3.931945 93  32 M (0.65591398 0.34408602)  
##            50) symmetry_worst< -1.486964 68  16 M (0.76470588 0.23529412)  
##             100) symmetry_worst>=-1.828219 53   5 M (0.90566038 0.09433962) *
##             101) symmetry_worst< -1.828219 15   4 B (0.26666667 0.73333333) *
##            51) symmetry_worst>=-1.486964 25   9 B (0.36000000 0.64000000)  
##             102) smoothness_mean>=-2.287745 16   7 M (0.56250000 0.43750000) *
##             103) smoothness_mean< -2.287745 9   0 B (0.00000000 1.00000000) *
##        13) compactness_se< -4.032549 51  21 B (0.41176471 0.58823529)  
##          26) smoothness_worst< -1.458214 22   2 M (0.90909091 0.09090909)  
##            52) texture_mean>=2.901883 20   0 M (1.00000000 0.00000000) *
##            53) texture_mean< 2.901883 2   0 B (0.00000000 1.00000000) *
##          27) smoothness_worst>=-1.458214 29   1 B (0.03448276 0.96551724)  
##            54) symmetry_worst< -1.741496 4   1 B (0.25000000 0.75000000)  
##             108) symmetry_worst>=-1.780237 1   0 M (1.00000000 0.00000000) *
##             109) symmetry_worst< -1.780237 3   0 B (0.00000000 1.00000000) *
##            55) symmetry_worst>=-1.741496 25   0 B (0.00000000 1.00000000) *
##       7) smoothness_worst< -1.472307 441 173 B (0.39229025 0.60770975)  
##        14) smoothness_worst< -1.476997 400 171 B (0.42750000 0.57250000)  
##          28) smoothness_worst>=-1.482701 34   3 M (0.91176471 0.08823529)  
##            56) compactness_se>=-4.290267 32   1 M (0.96875000 0.03125000)  
##             112) texture_mean>=2.732378 31   0 M (1.00000000 0.00000000) *
##             113) texture_mean< 2.732378 1   0 B (0.00000000 1.00000000) *
##            57) compactness_se< -4.290267 2   0 B (0.00000000 1.00000000) *
##          29) smoothness_worst< -1.482701 366 140 B (0.38251366 0.61748634)  
##            58) compactness_se< -3.476676 299 133 B (0.44481605 0.55518395)  
##             116) compactness_se>=-3.716111 71   9 M (0.87323944 0.12676056) *
##             117) compactness_se< -3.716111 228  71 B (0.31140351 0.68859649) *
##            59) compactness_se>=-3.476676 67   7 B (0.10447761 0.89552239)  
##             118) symmetry_worst>=-1.474719 7   1 M (0.85714286 0.14285714) *
##             119) symmetry_worst< -1.474719 60   1 B (0.01666667 0.98333333) *
##        15) smoothness_worst>=-1.476997 41   2 B (0.04878049 0.95121951)  
##          30) texture_worst>=4.844547 2   0 M (1.00000000 0.00000000) *
##          31) texture_worst< 4.844547 39   0 B (0.00000000 1.00000000) *
## 
## $trees[[64]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 433 B (0.47478070 0.52521930)  
##     2) compactness_se< -4.49319 106  31 M (0.70754717 0.29245283)  
##       4) compactness_se>=-4.704842 95  20 M (0.78947368 0.21052632)  
##         8) symmetry_worst< -1.509002 87  12 M (0.86206897 0.13793103)  
##          16) texture_mean< 3.232565 84   9 M (0.89285714 0.10714286)  
##            32) texture_mean>=2.846651 82   7 M (0.91463415 0.08536585)  
##              64) smoothness_mean< -2.295268 80   5 M (0.93750000 0.06250000) *
##              65) smoothness_mean>=-2.295268 2   0 B (0.00000000 1.00000000) *
##            33) texture_mean< 2.846651 2   0 B (0.00000000 1.00000000) *
##          17) texture_mean>=3.232565 3   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst>=-1.509002 8   0 B (0.00000000 1.00000000) *
##       5) compactness_se< -4.704842 11   0 B (0.00000000 1.00000000) *
##     3) compactness_se>=-4.49319 806 358 B (0.44416873 0.55583127)  
##       6) smoothness_worst>=-1.559144 598 295 B (0.49331104 0.50668896)  
##        12) symmetry_worst< -1.781339 272 109 M (0.59926471 0.40073529)  
##          24) smoothness_mean< -2.313857 158  42 M (0.73417722 0.26582278)  
##            48) compactness_se< -3.455891 144  29 M (0.79861111 0.20138889)  
##              96) symmetry_worst>=-2.233349 134  19 M (0.85820896 0.14179104) *
##              97) symmetry_worst< -2.233349 10   0 B (0.00000000 1.00000000) *
##            49) compactness_se>=-3.455891 14   1 B (0.07142857 0.92857143)  
##              98) smoothness_mean< -2.465359 1   0 M (1.00000000 0.00000000) *
##              99) smoothness_mean>=-2.465359 13   0 B (0.00000000 1.00000000) *
##          25) smoothness_mean>=-2.313857 114  47 B (0.41228070 0.58771930)  
##            50) smoothness_worst>=-1.499656 76  33 M (0.56578947 0.43421053)  
##             100) smoothness_mean< -2.219625 49  13 M (0.73469388 0.26530612) *
##             101) smoothness_mean>=-2.219625 27   7 B (0.25925926 0.74074074) *
##            51) smoothness_worst< -1.499656 38   4 B (0.10526316 0.89473684)  
##             102) compactness_se>=-3.239565 6   3 M (0.50000000 0.50000000) *
##             103) compactness_se< -3.239565 32   1 B (0.03125000 0.96875000) *
##        13) symmetry_worst>=-1.781339 326 132 B (0.40490798 0.59509202)  
##          26) symmetry_worst>=-1.524537 86  36 M (0.58139535 0.41860465)  
##            52) texture_mean>=2.777879 68  20 M (0.70588235 0.29411765)  
##             104) symmetry_worst< -1.124686 55  10 M (0.81818182 0.18181818) *
##             105) symmetry_worst>=-1.124686 13   3 B (0.23076923 0.76923077) *
##            53) texture_mean< 2.777879 18   2 B (0.11111111 0.88888889)  
##             106) compactness_se>=-3.173162 2   0 M (1.00000000 0.00000000) *
##             107) compactness_se< -3.173162 16   0 B (0.00000000 1.00000000) *
##          27) symmetry_worst< -1.524537 240  82 B (0.34166667 0.65833333)  
##            54) smoothness_mean>=-2.413908 195  78 B (0.40000000 0.60000000)  
##             108) smoothness_worst< -1.531349 15   0 M (1.00000000 0.00000000) *
##             109) smoothness_worst>=-1.531349 180  63 B (0.35000000 0.65000000) *
##            55) smoothness_mean< -2.413908 45   4 B (0.08888889 0.91111111)  
##             110) texture_worst>=5.003123 4   0 M (1.00000000 0.00000000) *
##             111) texture_worst< 5.003123 41   0 B (0.00000000 1.00000000) *
##       7) smoothness_worst< -1.559144 208  63 B (0.30288462 0.69711538)  
##        14) smoothness_mean>=-2.302636 14   2 M (0.85714286 0.14285714)  
##          28) compactness_se>=-3.929833 12   0 M (1.00000000 0.00000000) *
##          29) compactness_se< -3.929833 2   0 B (0.00000000 1.00000000) *
##        15) smoothness_mean< -2.302636 194  51 B (0.26288660 0.73711340)  
##          30) symmetry_worst>=-1.538661 25   8 M (0.68000000 0.32000000)  
##            60) texture_mean>=2.989073 17   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 2.989073 8   0 B (0.00000000 1.00000000) *
##          31) symmetry_worst< -1.538661 169  34 B (0.20118343 0.79881657)  
##            62) compactness_se>=-3.489046 38  16 B (0.42105263 0.57894737)  
##             124) texture_mean>=3.136493 8   0 M (1.00000000 0.00000000) *
##             125) texture_mean< 3.136493 30   8 B (0.26666667 0.73333333) *
##            63) compactness_se< -3.489046 131  18 B (0.13740458 0.86259542)  
##             126) texture_worst< 4.679785 66  18 B (0.27272727 0.72727273) *
##             127) texture_worst>=4.679785 65   0 B (0.00000000 1.00000000) *
## 
## $trees[[65]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 435 M (0.52302632 0.47697368)  
##     2) texture_worst>=4.260219 805 357 M (0.55652174 0.44347826)  
##       4) smoothness_worst>=-1.424105 61   9 M (0.85245902 0.14754098)  
##         8) smoothness_mean>=-2.361754 57   5 M (0.91228070 0.08771930)  
##          16) compactness_se>=-4.130421 56   4 M (0.92857143 0.07142857)  
##            32) smoothness_mean< -2.093138 48   1 M (0.97916667 0.02083333)  
##              64) smoothness_mean< -2.170242 39   0 M (1.00000000 0.00000000) *
##              65) smoothness_mean>=-2.170242 9   1 M (0.88888889 0.11111111) *
##            33) smoothness_mean>=-2.093138 8   3 M (0.62500000 0.37500000)  
##              66) texture_mean< 2.970462 5   0 M (1.00000000 0.00000000) *
##              67) texture_mean>=2.970462 3   0 B (0.00000000 1.00000000) *
##          17) compactness_se< -4.130421 1   0 B (0.00000000 1.00000000) *
##         9) smoothness_mean< -2.361754 4   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst< -1.424105 744 348 M (0.53225806 0.46774194)  
##        10) smoothness_mean< -2.261445 642 281 M (0.56230530 0.43769470)  
##          20) symmetry_worst>=-2.01934 506 196 M (0.61264822 0.38735178)  
##            40) smoothness_worst>=-1.52112 258  74 M (0.71317829 0.28682171)  
##              80) texture_worst< 4.545891 84  10 M (0.88095238 0.11904762) *
##              81) texture_worst>=4.545891 174  64 M (0.63218391 0.36781609) *
##            41) smoothness_worst< -1.52112 248 122 M (0.50806452 0.49193548)  
##              82) symmetry_worst>=-1.549706 56  16 M (0.71428571 0.28571429) *
##              83) symmetry_worst< -1.549706 192  86 B (0.44791667 0.55208333) *
##          21) symmetry_worst< -2.01934 136  51 B (0.37500000 0.62500000)  
##            42) symmetry_worst< -2.49184 17   1 M (0.94117647 0.05882353)  
##              84) texture_mean< 3.310501 16   0 M (1.00000000 0.00000000) *
##              85) texture_mean>=3.310501 1   0 B (0.00000000 1.00000000) *
##            43) symmetry_worst>=-2.49184 119  35 B (0.29411765 0.70588235)  
##              86) smoothness_mean< -2.352958 83  33 B (0.39759036 0.60240964) *
##              87) smoothness_mean>=-2.352958 36   2 B (0.05555556 0.94444444) *
##        11) smoothness_mean>=-2.261445 102  35 B (0.34313725 0.65686275)  
##          22) smoothness_mean>=-2.201842 16   2 M (0.87500000 0.12500000)  
##            44) texture_worst>=4.450297 14   0 M (1.00000000 0.00000000) *
##            45) texture_worst< 4.450297 2   0 B (0.00000000 1.00000000) *
##          23) smoothness_mean< -2.201842 86  21 B (0.24418605 0.75581395)  
##            46) texture_mean>=3.050442 13   4 M (0.69230769 0.30769231)  
##              92) compactness_se>=-4.008292 9   0 M (1.00000000 0.00000000) *
##              93) compactness_se< -4.008292 4   0 B (0.00000000 1.00000000) *
##            47) texture_mean< 3.050442 73  12 B (0.16438356 0.83561644)  
##              94) symmetry_worst>=-1.802807 41  12 B (0.29268293 0.70731707) *
##              95) symmetry_worst< -1.802807 32   0 B (0.00000000 1.00000000) *
##     3) texture_worst< 4.260219 107  29 B (0.27102804 0.72897196)  
##       6) symmetry_worst>=-1.567877 22   7 M (0.68181818 0.31818182)  
##        12) texture_mean>=2.756192 10   0 M (1.00000000 0.00000000) *
##        13) texture_mean< 2.756192 12   5 B (0.41666667 0.58333333)  
##          26) texture_mean< 2.518783 4   0 M (1.00000000 0.00000000) *
##          27) texture_mean>=2.518783 8   1 B (0.12500000 0.87500000)  
##            54) compactness_se>=-3.3026 1   0 M (1.00000000 0.00000000) *
##            55) compactness_se< -3.3026 7   0 B (0.00000000 1.00000000) *
##       7) symmetry_worst< -1.567877 85  14 B (0.16470588 0.83529412)  
##        14) smoothness_worst< -1.54469 39  12 B (0.30769231 0.69230769)  
##          28) smoothness_worst>=-1.545117 6   0 M (1.00000000 0.00000000) *
##          29) smoothness_worst< -1.545117 33   6 B (0.18181818 0.81818182)  
##            58) texture_mean>=2.764104 12   6 M (0.50000000 0.50000000)  
##             116) compactness_se>=-3.607729 6   0 M (1.00000000 0.00000000) *
##             117) compactness_se< -3.607729 6   0 B (0.00000000 1.00000000) *
##            59) texture_mean< 2.764104 21   0 B (0.00000000 1.00000000) *
##        15) smoothness_worst>=-1.54469 46   2 B (0.04347826 0.95652174)  
##          30) compactness_se< -3.823116 17   2 B (0.11764706 0.88235294)  
##            60) compactness_se>=-3.894783 2   0 M (1.00000000 0.00000000) *
##            61) compactness_se< -3.894783 15   0 B (0.00000000 1.00000000) *
##          31) compactness_se>=-3.823116 29   0 B (0.00000000 1.00000000) *
## 
## $trees[[66]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 425 M (0.53399123 0.46600877)  
##     2) texture_mean>=2.824054 799 353 M (0.55819775 0.44180225)  
##       4) texture_worst< 4.644679 402 144 M (0.64179104 0.35820896)  
##         8) smoothness_worst>=-1.451542 57   2 M (0.96491228 0.03508772)  
##          16) texture_worst>=4.226553 56   1 M (0.98214286 0.01785714)  
##            32) smoothness_worst< -1.349735 54   0 M (1.00000000 0.00000000) *
##            33) smoothness_worst>=-1.349735 2   1 M (0.50000000 0.50000000)  
##              66) texture_mean>=2.957438 1   0 M (1.00000000 0.00000000) *
##              67) texture_mean< 2.957438 1   0 B (0.00000000 1.00000000) *
##          17) texture_worst< 4.226553 1   0 B (0.00000000 1.00000000) *
##         9) smoothness_worst< -1.451542 345 142 M (0.58840580 0.41159420)  
##          18) smoothness_worst< -1.476997 312 111 M (0.64423077 0.35576923)  
##            36) texture_worst>=4.614897 56   4 M (0.92857143 0.07142857)  
##              72) compactness_se>=-4.694501 53   1 M (0.98113208 0.01886792) *
##              73) compactness_se< -4.694501 3   0 B (0.00000000 1.00000000) *
##            37) texture_worst< 4.614897 256 107 M (0.58203125 0.41796875)  
##              74) symmetry_worst< -1.561818 223  80 M (0.64125561 0.35874439) *
##              75) symmetry_worst>=-1.561818 33   6 B (0.18181818 0.81818182) *
##          19) smoothness_worst>=-1.476997 33   2 B (0.06060606 0.93939394)  
##            38) texture_mean>=2.932513 3   1 M (0.66666667 0.33333333)  
##              76) texture_mean< 2.963141 2   0 M (1.00000000 0.00000000) *
##              77) texture_mean>=2.963141 1   0 B (0.00000000 1.00000000) *
##            39) texture_mean< 2.932513 30   0 B (0.00000000 1.00000000) *
##       5) texture_worst>=4.644679 397 188 B (0.47355164 0.52644836)  
##        10) symmetry_worst>=-1.660659 184  71 M (0.61413043 0.38586957)  
##          20) symmetry_worst< -1.608146 43   2 M (0.95348837 0.04651163)  
##            40) texture_mean>=2.970637 42   1 M (0.97619048 0.02380952)  
##              80) texture_mean< 3.129266 41   0 M (1.00000000 0.00000000) *
##              81) texture_mean>=3.129266 1   0 B (0.00000000 1.00000000) *
##            41) texture_mean< 2.970637 1   0 B (0.00000000 1.00000000) *
##          21) symmetry_worst>=-1.608146 141  69 M (0.51063830 0.48936170)  
##            42) smoothness_mean>=-2.281815 28   0 M (1.00000000 0.00000000) *
##            43) smoothness_mean< -2.281815 113  44 B (0.38938053 0.61061947)  
##              86) compactness_se< -4.081893 36  10 M (0.72222222 0.27777778) *
##              87) compactness_se>=-4.081893 77  18 B (0.23376623 0.76623377) *
##        11) symmetry_worst< -1.660659 213  75 B (0.35211268 0.64788732)  
##          22) texture_worst>=4.837624 125  59 B (0.47200000 0.52800000)  
##            44) texture_worst< 4.985267 25   2 M (0.92000000 0.08000000)  
##              88) symmetry_worst>=-2.207844 24   1 M (0.95833333 0.04166667) *
##              89) symmetry_worst< -2.207844 1   0 B (0.00000000 1.00000000) *
##            45) texture_worst>=4.985267 100  36 B (0.36000000 0.64000000)  
##              90) compactness_se>=-4.248059 63  30 B (0.47619048 0.52380952) *
##              91) compactness_se< -4.248059 37   6 B (0.16216216 0.83783784) *
##          23) texture_worst< 4.837624 88  16 B (0.18181818 0.81818182)  
##            46) compactness_se>=-2.790746 6   0 M (1.00000000 0.00000000) *
##            47) compactness_se< -2.790746 82  10 B (0.12195122 0.87804878)  
##              94) symmetry_worst< -2.121358 22   9 B (0.40909091 0.59090909) *
##              95) symmetry_worst>=-2.121358 60   1 B (0.01666667 0.98333333) *
##     3) texture_mean< 2.824054 113  41 B (0.36283186 0.63716814)  
##       6) compactness_se>=-3.964431 83  41 B (0.49397590 0.50602410)  
##        12) texture_worst< 4.328009 73  32 M (0.56164384 0.43835616)  
##          24) texture_worst>=3.804403 63  23 M (0.63492063 0.36507937)  
##            48) texture_mean< 2.771335 45  11 M (0.75555556 0.24444444)  
##              96) compactness_se>=-3.891799 43   9 M (0.79069767 0.20930233) *
##              97) compactness_se< -3.891799 2   0 B (0.00000000 1.00000000) *
##            49) texture_mean>=2.771335 18   6 B (0.33333333 0.66666667)  
##              98) texture_mean>=2.811204 6   0 M (1.00000000 0.00000000) *
##              99) texture_mean< 2.811204 12   0 B (0.00000000 1.00000000) *
##          25) texture_worst< 3.804403 10   1 B (0.10000000 0.90000000)  
##            50) smoothness_mean< -2.298096 2   1 M (0.50000000 0.50000000)  
##             100) texture_mean< 2.673405 1   0 M (1.00000000 0.00000000) *
##             101) texture_mean>=2.673405 1   0 B (0.00000000 1.00000000) *
##            51) smoothness_mean>=-2.298096 8   0 B (0.00000000 1.00000000) *
##        13) texture_worst>=4.328009 10   0 B (0.00000000 1.00000000) *
##       7) compactness_se< -3.964431 30   0 B (0.00000000 1.00000000) *
## 
## $trees[[67]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 912 448 M (0.50877193 0.49122807)  
##    2) smoothness_mean>=-2.546123 895 431 M (0.51843575 0.48156425)  
##      4) texture_worst>=4.572846 507 213 M (0.57988166 0.42011834)  
##        8) smoothness_worst< -1.484675 342 116 M (0.66081871 0.33918129)  
##         16) smoothness_mean>=-2.403622 165  34 M (0.79393939 0.20606061)  
##           32) symmetry_worst>=-2.207988 153  22 M (0.85620915 0.14379085)  
##             64) smoothness_mean< -2.286221 131   9 M (0.93129771 0.06870229) *
##             65) smoothness_mean>=-2.286221 22   9 B (0.40909091 0.59090909) *
##           33) symmetry_worst< -2.207988 12   0 B (0.00000000 1.00000000) *
##         17) smoothness_mean< -2.403622 177  82 M (0.53672316 0.46327684)  
##           34) smoothness_worst< -1.568573 83  21 M (0.74698795 0.25301205)  
##             68) symmetry_worst>=-1.966444 71  11 M (0.84507042 0.15492958) *
##             69) symmetry_worst< -1.966444 12   2 B (0.16666667 0.83333333) *
##           35) smoothness_worst>=-1.568573 94  33 B (0.35106383 0.64893617)  
##             70) texture_worst< 4.982438 56  27 M (0.51785714 0.48214286) *
##             71) texture_worst>=4.982438 38   4 B (0.10526316 0.89473684) *
##        9) smoothness_worst>=-1.484675 165  68 B (0.41212121 0.58787879)  
##         18) smoothness_mean>=-2.284747 59  21 M (0.64406780 0.35593220)  
##           36) compactness_se>=-4.032549 44   9 M (0.79545455 0.20454545)  
##             72) smoothness_mean< -2.093138 37   2 M (0.94594595 0.05405405) *
##             73) smoothness_mean>=-2.093138 7   0 B (0.00000000 1.00000000) *
##           37) compactness_se< -4.032549 15   3 B (0.20000000 0.80000000)  
##             74) texture_mean< 2.979048 5   2 M (0.60000000 0.40000000) *
##             75) texture_mean>=2.979048 10   0 B (0.00000000 1.00000000) *
##         19) smoothness_mean< -2.284747 106  30 B (0.28301887 0.71698113)  
##           38) texture_worst< 4.624204 5   0 M (1.00000000 0.00000000) *
##           39) texture_worst>=4.624204 101  25 B (0.24752475 0.75247525)  
##             78) symmetry_worst>=-1.650994 55  23 B (0.41818182 0.58181818) *
##             79) symmetry_worst< -1.650994 46   2 B (0.04347826 0.95652174) *
##      5) texture_worst< 4.572846 388 170 B (0.43814433 0.56185567)  
##       10) texture_worst< 4.54138 343 167 B (0.48688047 0.51311953)  
##         20) smoothness_worst>=-1.451731 63  16 M (0.74603175 0.25396825)  
##           40) compactness_se>=-4.086695 55   8 M (0.85454545 0.14545455)  
##             80) symmetry_worst< -1.395041 46   3 M (0.93478261 0.06521739) *
##             81) symmetry_worst>=-1.395041 9   4 B (0.44444444 0.55555556) *
##           41) compactness_se< -4.086695 8   0 B (0.00000000 1.00000000) *
##         21) smoothness_worst< -1.451731 280 120 B (0.42857143 0.57142857)  
##           42) symmetry_worst< -2.401622 17   0 M (1.00000000 0.00000000) *
##           43) symmetry_worst>=-2.401622 263 103 B (0.39163498 0.60836502)  
##             86) texture_worst>=4.535341 14   0 M (1.00000000 0.00000000) *
##             87) texture_worst< 4.535341 249  89 B (0.35742972 0.64257028) *
##       11) texture_worst>=4.54138 45   3 B (0.06666667 0.93333333)  
##         22) smoothness_mean>=-2.282906 6   3 M (0.50000000 0.50000000)  
##           44) texture_mean>=2.943507 3   0 M (1.00000000 0.00000000) *
##           45) texture_mean< 2.943507 3   0 B (0.00000000 1.00000000) *
##         23) smoothness_mean< -2.282906 39   0 B (0.00000000 1.00000000) *
##    3) smoothness_mean< -2.546123 17   0 B (0.00000000 1.00000000) *
## 
## $trees[[68]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 400 M (0.56140351 0.43859649)  
##     2) compactness_se>=-3.721197 376 131 M (0.65159574 0.34840426)  
##       4) symmetry_worst>=-1.892495 256  69 M (0.73046875 0.26953125)  
##         8) texture_worst>=3.969009 249  63 M (0.74698795 0.25301205)  
##          16) compactness_se< -3.494301 97  13 M (0.86597938 0.13402062)  
##            32) smoothness_mean>=-2.380711 63   0 M (1.00000000 0.00000000) *
##            33) smoothness_mean< -2.380711 34  13 M (0.61764706 0.38235294)  
##              66) compactness_se< -3.5866 25   4 M (0.84000000 0.16000000) *
##              67) compactness_se>=-3.5866 9   0 B (0.00000000 1.00000000) *
##          17) compactness_se>=-3.494301 152  50 M (0.67105263 0.32894737)  
##            34) compactness_se>=-3.484318 131  29 M (0.77862595 0.22137405)  
##              68) compactness_se< -2.919705 112  19 M (0.83035714 0.16964286) *
##              69) compactness_se>=-2.919705 19   9 B (0.47368421 0.52631579) *
##            35) compactness_se< -3.484318 21   0 B (0.00000000 1.00000000) *
##         9) texture_worst< 3.969009 7   1 B (0.14285714 0.85714286)  
##          18) texture_mean< 2.366153 1   0 M (1.00000000 0.00000000) *
##          19) texture_mean>=2.366153 6   0 B (0.00000000 1.00000000) *
##       5) symmetry_worst< -1.892495 120  58 B (0.48333333 0.51666667)  
##        10) symmetry_worst< -1.982941 78  25 M (0.67948718 0.32051282)  
##          20) symmetry_worst>=-2.174839 40   4 M (0.90000000 0.10000000)  
##            40) texture_mean< 3.304787 38   2 M (0.94736842 0.05263158)  
##              80) smoothness_worst>=-1.604936 32   0 M (1.00000000 0.00000000) *
##              81) smoothness_worst< -1.604936 6   2 M (0.66666667 0.33333333) *
##            41) texture_mean>=3.304787 2   0 B (0.00000000 1.00000000) *
##          21) symmetry_worst< -2.174839 38  17 B (0.44736842 0.55263158)  
##            42) smoothness_mean< -2.437515 13   1 M (0.92307692 0.07692308)  
##              84) smoothness_mean>=-2.490273 12   0 M (1.00000000 0.00000000) *
##              85) smoothness_mean< -2.490273 1   0 B (0.00000000 1.00000000) *
##            43) smoothness_mean>=-2.437515 25   5 B (0.20000000 0.80000000)  
##              86) texture_mean>=3.190563 9   4 M (0.55555556 0.44444444) *
##              87) texture_mean< 3.190563 16   0 B (0.00000000 1.00000000) *
##        11) symmetry_worst>=-1.982941 42   5 B (0.11904762 0.88095238)  
##          22) texture_mean>=3.088324 5   0 M (1.00000000 0.00000000) *
##          23) texture_mean< 3.088324 37   0 B (0.00000000 1.00000000) *
##     3) compactness_se< -3.721197 536 267 B (0.49813433 0.50186567)  
##       6) compactness_se< -3.859436 458 209 M (0.54366812 0.45633188)  
##        12) texture_mean>=2.934384 269 100 M (0.62825279 0.37174721)  
##          24) compactness_se>=-4.28781 168  45 M (0.73214286 0.26785714)  
##            48) smoothness_mean< -2.290664 130  23 M (0.82307692 0.17692308)  
##              96) compactness_se< -3.869459 125  18 M (0.85600000 0.14400000) *
##              97) compactness_se>=-3.869459 5   0 B (0.00000000 1.00000000) *
##            49) smoothness_mean>=-2.290664 38  16 B (0.42105263 0.57894737)  
##              98) smoothness_mean>=-2.251921 15   2 M (0.86666667 0.13333333) *
##              99) smoothness_mean< -2.251921 23   3 B (0.13043478 0.86956522) *
##          25) compactness_se< -4.28781 101  46 B (0.45544554 0.54455446)  
##            50) texture_mean< 3.227241 71  25 M (0.64788732 0.35211268)  
##             100) compactness_se< -4.335534 64  18 M (0.71875000 0.28125000) *
##             101) compactness_se>=-4.335534 7   0 B (0.00000000 1.00000000) *
##            51) texture_mean>=3.227241 30   0 B (0.00000000 1.00000000) *
##        13) texture_mean< 2.934384 189  80 B (0.42328042 0.57671958)  
##          26) texture_mean< 2.898946 155  77 B (0.49677419 0.50322581)  
##            52) texture_mean>=2.876103 48  10 M (0.79166667 0.20833333)  
##             104) symmetry_worst< -1.701169 41   3 M (0.92682927 0.07317073) *
##             105) symmetry_worst>=-1.701169 7   0 B (0.00000000 1.00000000) *
##            53) texture_mean< 2.876103 107  39 B (0.36448598 0.63551402)  
##             106) smoothness_worst>=-1.542984 68  29 M (0.57352941 0.42647059) *
##             107) smoothness_worst< -1.542984 39   0 B (0.00000000 1.00000000) *
##          27) texture_mean>=2.898946 34   3 B (0.08823529 0.91176471)  
##            54) texture_worst>=4.707428 2   0 M (1.00000000 0.00000000) *
##            55) texture_worst< 4.707428 32   1 B (0.03125000 0.96875000)  
##             110) compactness_se< -4.680858 1   0 M (1.00000000 0.00000000) *
##             111) compactness_se>=-4.680858 31   0 B (0.00000000 1.00000000) *
##       7) compactness_se>=-3.859436 78  18 B (0.23076923 0.76923077)  
##        14) smoothness_worst>=-1.480731 33  15 M (0.54545455 0.45454545)  
##          28) texture_mean>=2.971675 12   0 M (1.00000000 0.00000000) *
##          29) texture_mean< 2.971675 21   6 B (0.28571429 0.71428571)  
##            58) symmetry_worst>=-1.612049 6   0 M (1.00000000 0.00000000) *
##            59) symmetry_worst< -1.612049 15   0 B (0.00000000 1.00000000) *
##        15) smoothness_worst< -1.480731 45   0 B (0.00000000 1.00000000) *
## 
## $trees[[69]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 455 B (0.49890351 0.50109649)  
##     2) smoothness_mean>=-2.416986 622 279 M (0.55144695 0.44855305)  
##       4) smoothness_mean< -2.349943 198  63 M (0.68181818 0.31818182)  
##         8) symmetry_worst>=-2.212871 182  47 M (0.74175824 0.25824176)  
##          16) texture_worst>=4.613791 108  11 M (0.89814815 0.10185185)  
##            32) smoothness_mean>=-2.408892 104   7 M (0.93269231 0.06730769)  
##              64) texture_mean< 3.36829 96   2 M (0.97916667 0.02083333) *
##              65) texture_mean>=3.36829 8   3 B (0.37500000 0.62500000) *
##            33) smoothness_mean< -2.408892 4   0 B (0.00000000 1.00000000) *
##          17) texture_worst< 4.613791 74  36 M (0.51351351 0.48648649)  
##            34) texture_mean>=2.97527 18   1 M (0.94444444 0.05555556)  
##              68) texture_mean< 3.041522 17   0 M (1.00000000 0.00000000) *
##              69) texture_mean>=3.041522 1   0 B (0.00000000 1.00000000) *
##            35) texture_mean< 2.97527 56  21 B (0.37500000 0.62500000)  
##              70) smoothness_worst>=-1.545556 29   8 M (0.72413793 0.27586207) *
##              71) smoothness_worst< -1.545556 27   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst< -2.212871 16   0 B (0.00000000 1.00000000) *
##       5) smoothness_mean>=-2.349943 424 208 B (0.49056604 0.50943396)  
##        10) smoothness_mean>=-2.332582 378 171 M (0.54761905 0.45238095)  
##          20) compactness_se>=-3.027402 41   3 M (0.92682927 0.07317073)  
##            40) compactness_se< -2.455682 38   1 M (0.97368421 0.02631579)  
##              80) smoothness_worst>=-1.554815 31   0 M (1.00000000 0.00000000) *
##              81) smoothness_worst< -1.554815 7   1 M (0.85714286 0.14285714) *
##            41) compactness_se>=-2.455682 3   1 B (0.33333333 0.66666667)  
##              82) texture_mean>=2.915767 1   0 M (1.00000000 0.00000000) *
##              83) texture_mean< 2.915767 2   0 B (0.00000000 1.00000000) *
##          21) compactness_se< -3.027402 337 168 M (0.50148368 0.49851632)  
##            42) smoothness_worst< -1.562856 27   1 M (0.96296296 0.03703704)  
##              84) smoothness_mean< -2.277089 23   0 M (1.00000000 0.00000000) *
##              85) smoothness_mean>=-2.277089 4   1 M (0.75000000 0.25000000) *
##            43) smoothness_worst>=-1.562856 310 143 B (0.46129032 0.53870968)  
##              86) texture_worst>=5.073596 13   0 M (1.00000000 0.00000000) *
##              87) texture_worst< 5.073596 297 130 B (0.43771044 0.56228956) *
##        11) smoothness_mean< -2.332582 46   1 B (0.02173913 0.97826087)  
##          22) symmetry_worst< -2.154356 1   0 M (1.00000000 0.00000000) *
##          23) symmetry_worst>=-2.154356 45   0 B (0.00000000 1.00000000) *
##     3) smoothness_mean< -2.416986 290 112 B (0.38620690 0.61379310)  
##       6) compactness_se>=-4.350232 200  94 B (0.47000000 0.53000000)  
##        12) compactness_se< -4.283814 28   1 M (0.96428571 0.03571429)  
##          24) smoothness_worst>=-1.654625 27   0 M (1.00000000 0.00000000) *
##          25) smoothness_worst< -1.654625 1   0 B (0.00000000 1.00000000) *
##        13) compactness_se>=-4.283814 172  67 B (0.38953488 0.61046512)  
##          26) texture_worst>=5.003123 29   6 M (0.79310345 0.20689655)  
##            52) smoothness_mean>=-2.492372 26   3 M (0.88461538 0.11538462)  
##             104) texture_worst< 5.316369 20   0 M (1.00000000 0.00000000) *
##             105) texture_worst>=5.316369 6   3 M (0.50000000 0.50000000) *
##            53) smoothness_mean< -2.492372 3   0 B (0.00000000 1.00000000) *
##          27) texture_worst< 5.003123 143  44 B (0.30769231 0.69230769)  
##            54) smoothness_worst< -1.598711 59  28 M (0.52542373 0.47457627)  
##             108) symmetry_worst>=-2.050132 40  11 M (0.72500000 0.27500000) *
##             109) symmetry_worst< -2.050132 19   2 B (0.10526316 0.89473684) *
##            55) smoothness_worst>=-1.598711 84  13 B (0.15476190 0.84523810)  
##             110) symmetry_worst< -1.995409 6   1 M (0.83333333 0.16666667) *
##             111) symmetry_worst>=-1.995409 78   8 B (0.10256410 0.89743590) *
##       7) compactness_se< -4.350232 90  18 B (0.20000000 0.80000000)  
##        14) texture_mean>=3.124472 21  10 B (0.47619048 0.52380952)  
##          28) texture_mean< 3.17309 10   0 M (1.00000000 0.00000000) *
##          29) texture_mean>=3.17309 11   0 B (0.00000000 1.00000000) *
##        15) texture_mean< 3.124472 69   8 B (0.11594203 0.88405797)  
##          30) symmetry_worst>=-1.658507 26   7 B (0.26923077 0.73076923)  
##            60) smoothness_mean< -2.503847 4   0 M (1.00000000 0.00000000) *
##            61) smoothness_mean>=-2.503847 22   3 B (0.13636364 0.86363636)  
##             122) texture_mean< 2.936149 4   1 M (0.75000000 0.25000000) *
##             123) texture_mean>=2.936149 18   0 B (0.00000000 1.00000000) *
##          31) symmetry_worst< -1.658507 43   1 B (0.02325581 0.97674419)  
##            62) compactness_se< -4.6643 6   1 B (0.16666667 0.83333333)  
##             124) compactness_se>=-4.740419 1   0 M (1.00000000 0.00000000) *
##             125) compactness_se< -4.740419 5   0 B (0.00000000 1.00000000) *
##            63) compactness_se>=-4.6643 37   0 B (0.00000000 1.00000000) *
## 
## $trees[[70]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 444 M (0.51315789 0.48684211)  
##     2) smoothness_mean>=-2.332634 422 163 M (0.61374408 0.38625592)  
##       4) smoothness_mean< -2.31481 70  10 M (0.85714286 0.14285714)  
##         8) texture_mean>=2.849464 56   2 M (0.96428571 0.03571429)  
##          16) compactness_se< -3.515615 42   0 M (1.00000000 0.00000000) *
##          17) compactness_se>=-3.515615 14   2 M (0.85714286 0.14285714)  
##            34) compactness_se>=-3.342347 12   0 M (1.00000000 0.00000000) *
##            35) compactness_se< -3.342347 2   0 B (0.00000000 1.00000000) *
##         9) texture_mean< 2.849464 14   6 B (0.42857143 0.57142857)  
##          18) smoothness_mean>=-2.322851 6   0 M (1.00000000 0.00000000) *
##          19) smoothness_mean< -2.322851 8   0 B (0.00000000 1.00000000) *
##       5) smoothness_mean>=-2.31481 352 153 M (0.56534091 0.43465909)  
##        10) compactness_se>=-4.222363 312 119 M (0.61858974 0.38141026)  
##          20) smoothness_mean>=-2.303285 292  99 M (0.66095890 0.33904110)  
##            40) symmetry_worst>=-1.775265 181  45 M (0.75138122 0.24861878)  
##              80) texture_mean>=2.777879 159  32 M (0.79874214 0.20125786) *
##              81) texture_mean< 2.777879 22   9 B (0.40909091 0.59090909) *
##            41) symmetry_worst< -1.775265 111  54 M (0.51351351 0.48648649)  
##              82) smoothness_worst< -1.433708 94  38 M (0.59574468 0.40425532) *
##              83) smoothness_worst>=-1.433708 17   1 B (0.05882353 0.94117647) *
##          21) smoothness_mean< -2.303285 20   0 B (0.00000000 1.00000000) *
##        11) compactness_se< -4.222363 40   6 B (0.15000000 0.85000000)  
##          22) smoothness_mean< -2.3007 6   0 M (1.00000000 0.00000000) *
##          23) smoothness_mean>=-2.3007 34   0 B (0.00000000 1.00000000) *
##     3) smoothness_mean< -2.332634 490 209 B (0.42653061 0.57346939)  
##       6) smoothness_mean< -2.349943 451 204 B (0.45232816 0.54767184)  
##        12) smoothness_mean>=-2.357834 19   0 M (1.00000000 0.00000000) *
##        13) smoothness_mean< -2.357834 432 185 B (0.42824074 0.57175926)  
##          26) texture_mean>=2.763153 411 185 B (0.45012165 0.54987835)  
##            52) texture_worst< 4.3976 64  21 M (0.67187500 0.32812500)  
##             104) smoothness_worst>=-1.554805 33   2 M (0.93939394 0.06060606) *
##             105) smoothness_worst< -1.554805 31  12 B (0.38709677 0.61290323) *
##            53) texture_worst>=4.3976 347 142 B (0.40922190 0.59077810)  
##             106) smoothness_mean>=-2.396732 78  30 M (0.61538462 0.38461538) *
##             107) smoothness_mean< -2.396732 269  94 B (0.34944238 0.65055762) *
##          27) texture_mean< 2.763153 21   0 B (0.00000000 1.00000000) *
##       7) smoothness_mean>=-2.349943 39   5 B (0.12820513 0.87179487)  
##        14) smoothness_worst>=-1.435092 2   0 M (1.00000000 0.00000000) *
##        15) smoothness_worst< -1.435092 37   3 B (0.08108108 0.91891892)  
##          30) symmetry_worst< -2.189951 3   1 M (0.66666667 0.33333333)  
##            60) texture_mean< 3.025767 2   0 M (1.00000000 0.00000000) *
##            61) texture_mean>=3.025767 1   0 B (0.00000000 1.00000000) *
##          31) symmetry_worst>=-2.189951 34   1 B (0.02941176 0.97058824)  
##            62) symmetry_worst>=-1.41845 7   1 B (0.14285714 0.85714286)  
##             124) texture_mean>=2.986903 1   0 M (1.00000000 0.00000000) *
##             125) texture_mean< 2.986903 6   0 B (0.00000000 1.00000000) *
##            63) symmetry_worst< -1.41845 27   0 B (0.00000000 1.00000000) *
## 
## $trees[[71]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 446 M (0.51096491 0.48903509)  
##     2) symmetry_worst>=-1.840831 571 247 M (0.56742557 0.43257443)  
##       4) compactness_se>=-3.690481 250  83 M (0.66800000 0.33200000)  
##         8) smoothness_worst< -1.434262 188  48 M (0.74468085 0.25531915)  
##          16) symmetry_worst< -1.128751 167  32 M (0.80838323 0.19161677)  
##            32) smoothness_mean>=-2.503795 163  28 M (0.82822086 0.17177914)  
##              64) texture_worst>=4.56463 93   8 M (0.91397849 0.08602151) *
##              65) texture_worst< 4.56463 70  20 M (0.71428571 0.28571429) *
##            33) smoothness_mean< -2.503795 4   0 B (0.00000000 1.00000000) *
##          17) symmetry_worst>=-1.128751 21   5 B (0.23809524 0.76190476)  
##            34) symmetry_worst>=-1.068249 5   0 M (1.00000000 0.00000000) *
##            35) symmetry_worst< -1.068249 16   0 B (0.00000000 1.00000000) *
##         9) smoothness_worst>=-1.434262 62  27 B (0.43548387 0.56451613)  
##          18) compactness_se< -3.470851 17   2 M (0.88235294 0.11764706)  
##            36) texture_mean>=2.688296 15   0 M (1.00000000 0.00000000) *
##            37) texture_mean< 2.688296 2   0 B (0.00000000 1.00000000) *
##          19) compactness_se>=-3.470851 45  12 B (0.26666667 0.73333333)  
##            38) symmetry_worst>=-1.306254 5   0 M (1.00000000 0.00000000) *
##            39) symmetry_worst< -1.306254 40   7 B (0.17500000 0.82500000)  
##              78) texture_worst< 4.846274 21   7 B (0.33333333 0.66666667) *
##              79) texture_worst>=4.846274 19   0 B (0.00000000 1.00000000) *
##       5) compactness_se< -3.690481 321 157 B (0.48909657 0.51090343)  
##        10) smoothness_worst< -1.576769 53  13 M (0.75471698 0.24528302)  
##          20) smoothness_mean< -2.496118 34   2 M (0.94117647 0.05882353)  
##            40) texture_mean< 3.17207 32   0 M (1.00000000 0.00000000) *
##            41) texture_mean>=3.17207 2   0 B (0.00000000 1.00000000) *
##          21) smoothness_mean>=-2.496118 19   8 B (0.42105263 0.57894737)  
##            42) smoothness_mean>=-2.434747 6   0 M (1.00000000 0.00000000) *
##            43) smoothness_mean< -2.434747 13   2 B (0.15384615 0.84615385)  
##              86) texture_mean>=3.31519 1   0 M (1.00000000 0.00000000) *
##              87) texture_mean< 3.31519 12   1 B (0.08333333 0.91666667) *
##        11) smoothness_worst>=-1.576769 268 117 B (0.43656716 0.56343284)  
##          22) smoothness_worst>=-1.556321 236 117 B (0.49576271 0.50423729)  
##            44) compactness_se< -3.859436 200  89 M (0.55500000 0.44500000)  
##              88) smoothness_mean>=-2.473387 190  79 M (0.58421053 0.41578947) *
##              89) smoothness_mean< -2.473387 10   0 B (0.00000000 1.00000000) *
##            45) compactness_se>=-3.859436 36   6 B (0.16666667 0.83333333)  
##              90) smoothness_worst>=-1.455217 8   2 M (0.75000000 0.25000000) *
##              91) smoothness_worst< -1.455217 28   0 B (0.00000000 1.00000000) *
##          23) smoothness_worst< -1.556321 32   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.840831 341 142 B (0.41642229 0.58357771)  
##       6) symmetry_worst< -1.925345 261 129 B (0.49425287 0.50574713)  
##        12) symmetry_worst>=-2.052205 122  46 M (0.62295082 0.37704918)  
##          24) smoothness_mean>=-2.449526 101  28 M (0.72277228 0.27722772)  
##            48) smoothness_worst< -1.540225 51   1 M (0.98039216 0.01960784)  
##              96) compactness_se< -3.451641 50   0 M (1.00000000 0.00000000) *
##              97) compactness_se>=-3.451641 1   0 B (0.00000000 1.00000000) *
##            49) smoothness_worst>=-1.540225 50  23 B (0.46000000 0.54000000)  
##              98) symmetry_worst< -1.990435 19   4 M (0.78947368 0.21052632) *
##              99) symmetry_worst>=-1.990435 31   8 B (0.25806452 0.74193548) *
##          25) smoothness_mean< -2.449526 21   3 B (0.14285714 0.85714286)  
##            50) smoothness_worst>=-1.503558 3   0 M (1.00000000 0.00000000) *
##            51) smoothness_worst< -1.503558 18   0 B (0.00000000 1.00000000) *
##        13) symmetry_worst< -2.052205 139  53 B (0.38129496 0.61870504)  
##          26) compactness_se>=-3.487878 36  10 M (0.72222222 0.27777778)  
##            52) texture_mean>=3.049609 28   2 M (0.92857143 0.07142857)  
##             104) smoothness_mean>=-2.661875 27   1 M (0.96296296 0.03703704) *
##             105) smoothness_mean< -2.661875 1   0 B (0.00000000 1.00000000) *
##            53) texture_mean< 3.049609 8   0 B (0.00000000 1.00000000) *
##          27) compactness_se< -3.487878 103  27 B (0.26213592 0.73786408)  
##            54) smoothness_mean>=-2.30775 25   6 M (0.76000000 0.24000000)  
##             108) smoothness_worst>=-1.497846 14   0 M (1.00000000 0.00000000) *
##             109) smoothness_worst< -1.497846 11   5 B (0.45454545 0.54545455) *
##            55) smoothness_mean< -2.30775 78   8 B (0.10256410 0.89743590)  
##             110) smoothness_worst>=-1.549837 28   8 B (0.28571429 0.71428571) *
##             111) smoothness_worst< -1.549837 50   0 B (0.00000000 1.00000000) *
##       7) symmetry_worst>=-1.925345 80  13 B (0.16250000 0.83750000)  
##        14) texture_worst>=4.937311 13   3 M (0.76923077 0.23076923)  
##          28) compactness_se>=-4.899363 10   0 M (1.00000000 0.00000000) *
##          29) compactness_se< -4.899363 3   0 B (0.00000000 1.00000000) *
##        15) texture_worst< 4.937311 67   3 B (0.04477612 0.95522388)  
##          30) smoothness_worst>=-1.424105 2   1 M (0.50000000 0.50000000)  
##            60) texture_mean< 2.876957 1   0 M (1.00000000 0.00000000) *
##            61) texture_mean>=2.876957 1   0 B (0.00000000 1.00000000) *
##          31) smoothness_worst< -1.424105 65   2 B (0.03076923 0.96923077)  
##            62) smoothness_mean< -2.386198 14   2 B (0.14285714 0.85714286)  
##             124) smoothness_mean>=-2.406561 2   0 M (1.00000000 0.00000000) *
##             125) smoothness_mean< -2.406561 12   0 B (0.00000000 1.00000000) *
##            63) smoothness_mean>=-2.386198 51   0 B (0.00000000 1.00000000) *
## 
## $trees[[72]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 410 B (0.44956140 0.55043860)  
##     2) texture_mean>=2.892314 675 335 B (0.49629630 0.50370370)  
##       4) symmetry_worst< -2.379234 18   0 M (1.00000000 0.00000000) *
##       5) symmetry_worst>=-2.379234 657 317 B (0.48249619 0.51750381)  
##        10) smoothness_mean< -2.473552 106  36 M (0.66037736 0.33962264)  
##          20) texture_mean>=2.935975 97  27 M (0.72164948 0.27835052)  
##            40) texture_mean< 3.15715 72  11 M (0.84722222 0.15277778)  
##              80) compactness_se< -2.82386 69   8 M (0.88405797 0.11594203) *
##              81) compactness_se>=-2.82386 3   0 B (0.00000000 1.00000000) *
##            41) texture_mean>=3.15715 25   9 B (0.36000000 0.64000000)  
##              82) smoothness_mean>=-2.489159 10   1 M (0.90000000 0.10000000) *
##              83) smoothness_mean< -2.489159 15   0 B (0.00000000 1.00000000) *
##          21) texture_mean< 2.935975 9   0 B (0.00000000 1.00000000) *
##        11) smoothness_mean>=-2.473552 551 247 B (0.44827586 0.55172414)  
##          22) smoothness_worst>=-1.609811 516 246 B (0.47674419 0.52325581)  
##            44) symmetry_worst>=-2.193154 471 234 M (0.50318471 0.49681529)  
##              88) symmetry_worst< -2.115313 18   0 M (1.00000000 0.00000000) *
##              89) symmetry_worst>=-2.115313 453 219 B (0.48344371 0.51655629) *
##            45) symmetry_worst< -2.193154 45   9 B (0.20000000 0.80000000)  
##              90) smoothness_mean< -2.447413 5   0 M (1.00000000 0.00000000) *
##              91) smoothness_mean>=-2.447413 40   4 B (0.10000000 0.90000000) *
##          23) smoothness_worst< -1.609811 35   1 B (0.02857143 0.97142857)  
##            46) smoothness_mean>=-2.337942 1   0 M (1.00000000 0.00000000) *
##            47) smoothness_mean< -2.337942 34   0 B (0.00000000 1.00000000) *
##     3) texture_mean< 2.892314 237  75 B (0.31645570 0.68354430)  
##       6) compactness_se>=-3.891799 118  53 B (0.44915254 0.55084746)  
##        12) smoothness_worst>=-1.482701 62  24 M (0.61290323 0.38709677)  
##          24) symmetry_worst>=-1.732707 37   7 M (0.81081081 0.18918919)  
##            48) symmetry_worst< -1.395292 20   0 M (1.00000000 0.00000000) *
##            49) symmetry_worst>=-1.395292 17   7 M (0.58823529 0.41176471)  
##              98) symmetry_worst>=-1.281003 11   1 M (0.90909091 0.09090909) *
##              99) symmetry_worst< -1.281003 6   0 B (0.00000000 1.00000000) *
##          25) symmetry_worst< -1.732707 25   8 B (0.32000000 0.68000000)  
##            50) smoothness_worst< -1.478176 8   0 M (1.00000000 0.00000000) *
##            51) smoothness_worst>=-1.478176 17   0 B (0.00000000 1.00000000) *
##        13) smoothness_worst< -1.482701 56  15 B (0.26785714 0.73214286)  
##          26) texture_worst< 3.919786 15   5 M (0.66666667 0.33333333)  
##            52) smoothness_mean>=-2.461945 11   1 M (0.90909091 0.09090909)  
##             104) smoothness_mean< -2.298096 10   0 M (1.00000000 0.00000000) *
##             105) smoothness_mean>=-2.298096 1   0 B (0.00000000 1.00000000) *
##            53) smoothness_mean< -2.461945 4   0 B (0.00000000 1.00000000) *
##          27) texture_worst>=3.919786 41   5 B (0.12195122 0.87804878)  
##            54) texture_mean< 2.717337 6   1 M (0.83333333 0.16666667)  
##             108) symmetry_worst>=-1.940832 5   0 M (1.00000000 0.00000000) *
##             109) symmetry_worst< -1.940832 1   0 B (0.00000000 1.00000000) *
##            55) texture_mean>=2.717337 35   0 B (0.00000000 1.00000000) *
##       7) compactness_se< -3.891799 119  22 B (0.18487395 0.81512605)  
##        14) compactness_se< -4.159844 54  18 B (0.33333333 0.66666667)  
##          28) compactness_se>=-4.198706 14   2 M (0.85714286 0.14285714)  
##            56) texture_mean>=2.772165 12   0 M (1.00000000 0.00000000) *
##            57) texture_mean< 2.772165 2   0 B (0.00000000 1.00000000) *
##          29) compactness_se< -4.198706 40   6 B (0.15000000 0.85000000)  
##            58) smoothness_worst< -1.546636 7   1 M (0.85714286 0.14285714)  
##             116) texture_mean>=2.85389 6   0 M (1.00000000 0.00000000) *
##             117) texture_mean< 2.85389 1   0 B (0.00000000 1.00000000) *
##            59) smoothness_worst>=-1.546636 33   0 B (0.00000000 1.00000000) *
##        15) compactness_se>=-4.159844 65   4 B (0.06153846 0.93846154)  
##          30) smoothness_worst>=-1.451541 8   4 M (0.50000000 0.50000000)  
##            60) texture_mean>=2.803301 4   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 2.803301 4   0 B (0.00000000 1.00000000) *
##          31) smoothness_worst< -1.451541 57   0 B (0.00000000 1.00000000) *
## 
## $trees[[73]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 411 M (0.54934211 0.45065789)  
##     2) texture_mean>=2.892591 690 269 M (0.61014493 0.38985507)  
##       4) texture_worst< 4.753106 382 117 M (0.69371728 0.30628272)  
##         8) texture_mean>=3.055881 73   7 M (0.90410959 0.09589041)  
##          16) compactness_se>=-4.572499 70   4 M (0.94285714 0.05714286)  
##            32) smoothness_worst>=-1.606352 54   0 M (1.00000000 0.00000000) *
##            33) smoothness_worst< -1.606352 16   4 M (0.75000000 0.25000000)  
##              66) smoothness_worst< -1.693722 12   0 M (1.00000000 0.00000000) *
##              67) smoothness_worst>=-1.693722 4   0 B (0.00000000 1.00000000) *
##          17) compactness_se< -4.572499 3   0 B (0.00000000 1.00000000) *
##         9) texture_mean< 3.055881 309 110 M (0.64401294 0.35598706)  
##          18) smoothness_worst>=-1.473478 53   5 M (0.90566038 0.09433962)  
##            36) texture_mean>=2.934384 49   2 M (0.95918367 0.04081633)  
##              72) texture_mean< 3.039982 46   0 M (1.00000000 0.00000000) *
##              73) texture_mean>=3.039982 3   1 B (0.33333333 0.66666667) *
##            37) texture_mean< 2.934384 4   1 B (0.25000000 0.75000000)  
##              74) smoothness_mean< -2.240603 1   0 M (1.00000000 0.00000000) *
##              75) smoothness_mean>=-2.240603 3   0 B (0.00000000 1.00000000) *
##          19) smoothness_worst< -1.473478 256 105 M (0.58984375 0.41015625)  
##            38) smoothness_worst< -1.476997 239  88 M (0.63179916 0.36820084)  
##              76) smoothness_mean< -2.234468 204  62 M (0.69607843 0.30392157) *
##              77) smoothness_mean>=-2.234468 35   9 B (0.25714286 0.74285714) *
##            39) smoothness_worst>=-1.476997 17   0 B (0.00000000 1.00000000) *
##       5) texture_worst>=4.753106 308 152 M (0.50649351 0.49350649)  
##        10) texture_worst>=4.818867 254 107 M (0.57874016 0.42125984)  
##          20) smoothness_mean>=-2.462871 198  66 M (0.66666667 0.33333333)  
##            40) symmetry_worst>=-2.207988 184  53 M (0.71195652 0.28804348)  
##              80) smoothness_worst< -1.447185 130  21 M (0.83846154 0.16153846) *
##              81) smoothness_worst>=-1.447185 54  22 B (0.40740741 0.59259259) *
##            41) symmetry_worst< -2.207988 14   1 B (0.07142857 0.92857143)  
##              82) smoothness_mean>=-2.282229 1   0 M (1.00000000 0.00000000) *
##              83) smoothness_mean< -2.282229 13   0 B (0.00000000 1.00000000) *
##          21) smoothness_mean< -2.462871 56  15 B (0.26785714 0.73214286)  
##            42) symmetry_worst< -1.635915 27  12 M (0.55555556 0.44444444)  
##              84) texture_mean< 3.17309 14   2 M (0.85714286 0.14285714) *
##              85) texture_mean>=3.17309 13   3 B (0.23076923 0.76923077) *
##            43) symmetry_worst>=-1.635915 29   0 B (0.00000000 1.00000000) *
##        11) texture_worst< 4.818867 54   9 B (0.16666667 0.83333333)  
##          22) symmetry_worst>=-0.9904278 4   0 M (1.00000000 0.00000000) *
##          23) symmetry_worst< -0.9904278 50   5 B (0.10000000 0.90000000)  
##            46) compactness_se>=-3.322755 4   1 M (0.75000000 0.25000000)  
##              92) smoothness_mean>=-2.522464 3   0 M (1.00000000 0.00000000) *
##              93) smoothness_mean< -2.522464 1   0 B (0.00000000 1.00000000) *
##            47) compactness_se< -3.322755 46   2 B (0.04347826 0.95652174)  
##              94) texture_worst< 4.781945 9   2 B (0.22222222 0.77777778) *
##              95) texture_worst>=4.781945 37   0 B (0.00000000 1.00000000) *
##     3) texture_mean< 2.892591 222  80 B (0.36036036 0.63963964)  
##       6) smoothness_mean>=-2.31958 117  58 M (0.50427350 0.49572650)  
##        12) texture_mean< 2.844609 71  23 M (0.67605634 0.32394366)  
##          24) texture_worst>=4.1745 40   6 M (0.85000000 0.15000000)  
##            48) symmetry_worst>=-1.987693 38   4 M (0.89473684 0.10526316)  
##              96) smoothness_mean< -2.238735 22   0 M (1.00000000 0.00000000) *
##              97) smoothness_mean>=-2.238735 16   4 M (0.75000000 0.25000000) *
##            49) symmetry_worst< -1.987693 2   0 B (0.00000000 1.00000000) *
##          25) texture_worst< 4.1745 31  14 B (0.45161290 0.54838710)  
##            50) smoothness_mean< -2.298594 7   0 M (1.00000000 0.00000000) *
##            51) smoothness_mean>=-2.298594 24   7 B (0.29166667 0.70833333)  
##             102) symmetry_worst>=-1.612049 10   3 M (0.70000000 0.30000000) *
##             103) symmetry_worst< -1.612049 14   0 B (0.00000000 1.00000000) *
##        13) texture_mean>=2.844609 46  11 B (0.23913043 0.76086957)  
##          26) texture_worst>=4.669441 4   0 M (1.00000000 0.00000000) *
##          27) texture_worst< 4.669441 42   7 B (0.16666667 0.83333333)  
##            54) texture_worst< 4.361241 11   4 M (0.63636364 0.36363636)  
##             108) texture_mean>=2.857891 8   1 M (0.87500000 0.12500000) *
##             109) texture_mean< 2.857891 3   0 B (0.00000000 1.00000000) *
##            55) texture_worst>=4.361241 31   0 B (0.00000000 1.00000000) *
##       7) smoothness_mean< -2.31958 105  21 B (0.20000000 0.80000000)  
##        14) compactness_se< -4.31315 21  10 M (0.52380952 0.47619048)  
##          28) texture_mean>=2.871852 8   0 M (1.00000000 0.00000000) *
##          29) texture_mean< 2.871852 13   3 B (0.23076923 0.76923077)  
##            58) smoothness_mean>=-2.372437 5   2 M (0.60000000 0.40000000)  
##             116) texture_mean>=2.800736 3   0 M (1.00000000 0.00000000) *
##             117) texture_mean< 2.800736 2   0 B (0.00000000 1.00000000) *
##            59) smoothness_mean< -2.372437 8   0 B (0.00000000 1.00000000) *
##        15) compactness_se>=-4.31315 84  10 B (0.11904762 0.88095238)  
##          30) smoothness_worst>=-1.452493 4   1 M (0.75000000 0.25000000)  
##            60) texture_mean>=2.597803 3   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 2.597803 1   0 B (0.00000000 1.00000000) *
##          31) smoothness_worst< -1.452493 80   7 B (0.08750000 0.91250000)  
##            62) compactness_se>=-3.488718 22   7 B (0.31818182 0.68181818)  
##             124) compactness_se< -3.483667 4   0 M (1.00000000 0.00000000) *
##             125) compactness_se>=-3.483667 18   3 B (0.16666667 0.83333333) *
##            63) compactness_se< -3.488718 58   0 B (0.00000000 1.00000000) *
## 
## $trees[[74]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 447 B (0.49013158 0.50986842)  
##     2) texture_mean>=2.963467 511 206 M (0.59686888 0.40313112)  
##       4) smoothness_worst>=-1.579228 402 137 M (0.65920398 0.34079602)  
##         8) smoothness_mean>=-2.501158 393 128 M (0.67430025 0.32569975)  
##          16) smoothness_worst< -1.568787 27   0 M (1.00000000 0.00000000) *
##          17) smoothness_worst>=-1.568787 366 128 M (0.65027322 0.34972678)  
##            34) smoothness_worst>=-1.561324 355 118 M (0.66760563 0.33239437)  
##              68) smoothness_worst< -1.532606 55   6 M (0.89090909 0.10909091) *
##              69) smoothness_worst>=-1.532606 300 112 M (0.62666667 0.37333333) *
##            35) smoothness_worst< -1.561324 11   1 B (0.09090909 0.90909091)  
##              70) compactness_se>=-2.682598 1   0 M (1.00000000 0.00000000) *
##              71) compactness_se< -2.682598 10   0 B (0.00000000 1.00000000) *
##         9) smoothness_mean< -2.501158 9   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst< -1.579228 109  40 B (0.36697248 0.63302752)  
##        10) symmetry_worst>=-1.693879 35   9 M (0.74285714 0.25714286)  
##          20) compactness_se< -3.885202 21   0 M (1.00000000 0.00000000) *
##          21) compactness_se>=-3.885202 14   5 B (0.35714286 0.64285714)  
##            42) compactness_se>=-3.153142 5   0 M (1.00000000 0.00000000) *
##            43) compactness_se< -3.153142 9   0 B (0.00000000 1.00000000) *
##        11) symmetry_worst< -1.693879 74  14 B (0.18918919 0.81081081)  
##          22) symmetry_worst< -2.081905 19   9 M (0.52631579 0.47368421)  
##            44) compactness_se>=-3.424051 10   2 M (0.80000000 0.20000000)  
##              88) smoothness_mean>=-2.638103 8   0 M (1.00000000 0.00000000) *
##              89) smoothness_mean< -2.638103 2   0 B (0.00000000 1.00000000) *
##            45) compactness_se< -3.424051 9   2 B (0.22222222 0.77777778)  
##              90) smoothness_mean>=-2.373466 2   0 M (1.00000000 0.00000000) *
##              91) smoothness_mean< -2.373466 7   0 B (0.00000000 1.00000000) *
##          23) symmetry_worst>=-2.081905 55   4 B (0.07272727 0.92727273)  
##            46) texture_mean< 2.969886 2   0 M (1.00000000 0.00000000) *
##            47) texture_mean>=2.969886 53   2 B (0.03773585 0.96226415)  
##              94) texture_mean>=3.172196 5   2 B (0.40000000 0.60000000) *
##              95) texture_mean< 3.172196 48   0 B (0.00000000 1.00000000) *
##     3) texture_mean< 2.963467 401 142 B (0.35411471 0.64588529)  
##       6) symmetry_worst>=-1.325507 22   3 M (0.86363636 0.13636364)  
##        12) smoothness_mean>=-2.340715 20   1 M (0.95000000 0.05000000)  
##          24) compactness_se< -2.588521 19   0 M (1.00000000 0.00000000) *
##          25) compactness_se>=-2.588521 1   0 B (0.00000000 1.00000000) *
##        13) smoothness_mean< -2.340715 2   0 B (0.00000000 1.00000000) *
##       7) symmetry_worst< -1.325507 379 123 B (0.32453826 0.67546174)  
##        14) compactness_se< -3.426516 323 120 B (0.37151703 0.62848297)  
##          28) compactness_se>=-3.764682 83  34 M (0.59036145 0.40963855)  
##            56) symmetry_worst>=-1.813857 50  11 M (0.78000000 0.22000000)  
##             112) texture_worst>=4.256309 30   0 M (1.00000000 0.00000000) *
##             113) texture_worst< 4.256309 20   9 B (0.45000000 0.55000000) *
##            57) symmetry_worst< -1.813857 33  10 B (0.30303030 0.69696970)  
##             114) texture_worst< 4.000974 9   2 M (0.77777778 0.22222222) *
##             115) texture_worst>=4.000974 24   3 B (0.12500000 0.87500000) *
##          29) compactness_se< -3.764682 240  71 B (0.29583333 0.70416667)  
##            58) smoothness_mean>=-2.391331 150  57 B (0.38000000 0.62000000)  
##             116) texture_worst>=4.389172 100  48 M (0.52000000 0.48000000) *
##             117) texture_worst< 4.389172 50   5 B (0.10000000 0.90000000) *
##            59) smoothness_mean< -2.391331 90  14 B (0.15555556 0.84444444)  
##             118) texture_worst< 4.411124 16   4 M (0.75000000 0.25000000) *
##             119) texture_worst>=4.411124 74   2 B (0.02702703 0.97297297) *
##        15) compactness_se>=-3.426516 56   3 B (0.05357143 0.94642857)  
##          30) smoothness_mean>=-2.154617 4   2 M (0.50000000 0.50000000)  
##            60) texture_mean>=2.720927 2   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 2.720927 2   0 B (0.00000000 1.00000000) *
##          31) smoothness_mean< -2.154617 52   1 B (0.01923077 0.98076923)  
##            62) symmetry_worst< -1.783471 6   1 B (0.16666667 0.83333333)  
##             124) texture_mean>=2.902347 1   0 M (1.00000000 0.00000000) *
##             125) texture_mean< 2.902347 5   0 B (0.00000000 1.00000000) *
##            63) symmetry_worst>=-1.783471 46   0 B (0.00000000 1.00000000) *
## 
## $trees[[75]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 390 B (0.4276316 0.5723684)  
##     2) texture_worst>=5.402766 22   0 M (1.0000000 0.0000000) *
##     3) texture_worst< 5.402766 890 368 B (0.4134831 0.5865169)  
##       6) smoothness_mean>=-2.173316 62  18 M (0.7096774 0.2903226)  
##        12) smoothness_worst< -1.409741 29   0 M (1.0000000 0.0000000) *
##        13) smoothness_worst>=-1.409741 33  15 B (0.4545455 0.5454545)  
##          26) symmetry_worst>=-1.656121 22   7 M (0.6818182 0.3181818)  
##            52) texture_mean>=2.688296 15   0 M (1.0000000 0.0000000) *
##            53) texture_mean< 2.688296 7   0 B (0.0000000 1.0000000) *
##          27) symmetry_worst< -1.656121 11   0 B (0.0000000 1.0000000) *
##       7) smoothness_mean< -2.173316 828 324 B (0.3913043 0.6086957)  
##        14) symmetry_worst>=-1.001713 11   0 M (1.0000000 0.0000000) *
##        15) symmetry_worst< -1.001713 817 313 B (0.3831089 0.6168911)  
##          30) compactness_se>=-4.505325 733 299 B (0.4079127 0.5920873)  
##            60) compactness_se< -4.49319 14   0 M (1.0000000 0.0000000) *
##            61) compactness_se>=-4.49319 719 285 B (0.3963839 0.6036161)  
##             122) texture_worst>=4.905415 138  59 M (0.5724638 0.4275362) *
##             123) texture_worst< 4.905415 581 206 B (0.3545611 0.6454389) *
##          31) compactness_se< -4.505325 84  14 B (0.1666667 0.8333333)  
##            62) symmetry_worst< -2.374205 6   0 M (1.0000000 0.0000000) *
##            63) symmetry_worst>=-2.374205 78   8 B (0.1025641 0.8974359)  
##             126) smoothness_mean< -2.449246 28   8 B (0.2857143 0.7142857) *
##             127) smoothness_mean>=-2.449246 50   0 B (0.0000000 1.0000000) *
## 
## $trees[[76]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 912 438 B (0.48026316 0.51973684)  
##    2) compactness_se>=-4.694501 893 438 B (0.49048152 0.50951848)  
##      4) texture_worst>=4.275472 782 381 M (0.51278772 0.48721228)  
##        8) smoothness_mean>=-2.201842 46  10 M (0.78260870 0.21739130)  
##         16) smoothness_mean< -2.093138 40   5 M (0.87500000 0.12500000)  
##           32) compactness_se>=-4.24572 38   3 M (0.92105263 0.07894737)  
##             64) texture_mean>=2.918531 31   0 M (1.00000000 0.00000000) *
##             65) texture_mean< 2.918531 7   3 M (0.57142857 0.42857143) *
##           33) compactness_se< -4.24572 2   0 B (0.00000000 1.00000000) *
##         17) smoothness_mean>=-2.093138 6   1 B (0.16666667 0.83333333)  
##           34) texture_mean< 2.894137 1   0 M (1.00000000 0.00000000) *
##           35) texture_mean>=2.894137 5   0 B (0.00000000 1.00000000) *
##        9) smoothness_mean< -2.201842 736 365 B (0.49592391 0.50407609)  
##         18) smoothness_mean< -2.235394 690 332 M (0.51884058 0.48115942)  
##           36) texture_worst< 4.550789 191  60 M (0.68586387 0.31413613)  
##             72) compactness_se< -2.751692 180  49 M (0.72777778 0.27222222) *
##             73) compactness_se>=-2.751692 11   0 B (0.00000000 1.00000000) *
##           37) texture_worst>=4.550789 499 227 B (0.45490982 0.54509018)  
##             74) texture_worst>=4.572846 457 223 B (0.48796499 0.51203501) *
##             75) texture_worst< 4.572846 42   4 B (0.09523810 0.90476190) *
##         19) smoothness_mean>=-2.235394 46   7 B (0.15217391 0.84782609)  
##           38) texture_mean>=3.04949 4   0 M (1.00000000 0.00000000) *
##           39) texture_mean< 3.04949 42   3 B (0.07142857 0.92857143)  
##             78) texture_worst< 4.329277 1   0 M (1.00000000 0.00000000) *
##             79) texture_worst>=4.329277 41   2 B (0.04878049 0.95121951) *
##      5) texture_worst< 4.275472 111  37 B (0.33333333 0.66666667)  
##       10) texture_worst< 4.18243 70  34 B (0.48571429 0.51428571)  
##         20) compactness_se>=-3.97985 59  25 M (0.57627119 0.42372881)  
##           40) compactness_se< -3.48221 41  12 M (0.70731707 0.29268293)  
##             80) smoothness_mean>=-2.466148 37   8 M (0.78378378 0.21621622) *
##             81) smoothness_mean< -2.466148 4   0 B (0.00000000 1.00000000) *
##           41) compactness_se>=-3.48221 18   5 B (0.27777778 0.72222222)  
##             82) smoothness_worst>=-1.490036 5   0 M (1.00000000 0.00000000) *
##             83) smoothness_worst< -1.490036 13   0 B (0.00000000 1.00000000) *
##         21) compactness_se< -3.97985 11   0 B (0.00000000 1.00000000) *
##       11) texture_worst>=4.18243 41   3 B (0.07317073 0.92682927)  
##         22) texture_mean< 2.715026 3   1 M (0.66666667 0.33333333)  
##           44) texture_mean>=2.710705 2   0 M (1.00000000 0.00000000) *
##           45) texture_mean< 2.710705 1   0 B (0.00000000 1.00000000) *
##         23) texture_mean>=2.715026 38   1 B (0.02631579 0.97368421)  
##           46) smoothness_mean>=-2.15202 1   0 M (1.00000000 0.00000000) *
##           47) smoothness_mean< -2.15202 37   0 B (0.00000000 1.00000000) *
##    3) compactness_se< -4.694501 19   0 B (0.00000000 1.00000000) *
## 
## $trees[[77]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 440 M (0.51754386 0.48245614)  
##     2) compactness_se>=-4.198706 699 310 M (0.55650930 0.44349070)  
##       4) texture_worst>=4.481821 528 209 M (0.60416667 0.39583333)  
##         8) symmetry_worst>=-1.43353 43   4 M (0.90697674 0.09302326)  
##          16) texture_mean< 3.110611 37   0 M (1.00000000 0.00000000) *
##          17) texture_mean>=3.110611 6   2 B (0.33333333 0.66666667)  
##            34) texture_mean>=3.146047 2   0 M (1.00000000 0.00000000) *
##            35) texture_mean< 3.146047 4   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst< -1.43353 485 205 M (0.57731959 0.42268041)  
##          18) compactness_se< -4.116284 38   3 M (0.92105263 0.07894737)  
##            36) smoothness_mean>=-2.434347 35   0 M (1.00000000 0.00000000) *
##            37) smoothness_mean< -2.434347 3   0 B (0.00000000 1.00000000) *
##          19) compactness_se>=-4.116284 447 202 M (0.54809843 0.45190157)  
##            38) texture_mean>=2.892591 426 183 M (0.57042254 0.42957746)  
##              76) compactness_se< -4.094455 19   0 M (1.00000000 0.00000000) *
##              77) compactness_se>=-4.094455 407 183 M (0.55036855 0.44963145) *
##            39) texture_mean< 2.892591 21   2 B (0.09523810 0.90476190)  
##              78) smoothness_worst>=-1.459092 2   0 M (1.00000000 0.00000000) *
##              79) smoothness_worst< -1.459092 19   0 B (0.00000000 1.00000000) *
##       5) texture_worst< 4.481821 171  70 B (0.40935673 0.59064327)  
##        10) texture_mean< 2.760642 60  21 M (0.65000000 0.35000000)  
##          20) smoothness_mean>=-2.360495 50  11 M (0.78000000 0.22000000)  
##            40) compactness_se>=-3.943187 47   8 M (0.82978723 0.17021277)  
##              80) symmetry_worst< -1.461208 42   5 M (0.88095238 0.11904762) *
##              81) symmetry_worst>=-1.461208 5   2 B (0.40000000 0.60000000) *
##            41) compactness_se< -3.943187 3   0 B (0.00000000 1.00000000) *
##          21) smoothness_mean< -2.360495 10   0 B (0.00000000 1.00000000) *
##        11) texture_mean>=2.760642 111  31 B (0.27927928 0.72072072)  
##          22) compactness_se< -4.160164 8   0 M (1.00000000 0.00000000) *
##          23) compactness_se>=-4.160164 103  23 B (0.22330097 0.77669903)  
##            46) texture_worst< 4.034664 5   0 M (1.00000000 0.00000000) *
##            47) texture_worst>=4.034664 98  18 B (0.18367347 0.81632653)  
##              94) compactness_se>=-3.294139 11   4 M (0.63636364 0.36363636) *
##              95) compactness_se< -3.294139 87  11 B (0.12643678 0.87356322) *
##     3) compactness_se< -4.198706 213  83 B (0.38967136 0.61032864)  
##       6) smoothness_mean< -2.3007 176  81 B (0.46022727 0.53977273)  
##        12) texture_mean< 3.217018 147  69 M (0.53061224 0.46938776)  
##          24) texture_mean>=2.960617 85  26 M (0.69411765 0.30588235)  
##            48) texture_worst< 4.984637 60  10 M (0.83333333 0.16666667)  
##              96) symmetry_worst>=-2.046832 56   6 M (0.89285714 0.10714286) *
##              97) symmetry_worst< -2.046832 4   0 B (0.00000000 1.00000000) *
##            49) texture_worst>=4.984637 25   9 B (0.36000000 0.64000000)  
##              98) texture_mean>=3.12836 9   0 M (1.00000000 0.00000000) *
##              99) texture_mean< 3.12836 16   0 B (0.00000000 1.00000000) *
##          25) texture_mean< 2.960617 62  19 B (0.30645161 0.69354839)  
##            50) compactness_se< -4.327955 41  19 B (0.46341463 0.53658537)  
##             100) compactness_se>=-4.356557 8   0 M (1.00000000 0.00000000) *
##             101) compactness_se< -4.356557 33  11 B (0.33333333 0.66666667) *
##            51) compactness_se>=-4.327955 21   0 B (0.00000000 1.00000000) *
##        13) texture_mean>=3.217018 29   3 B (0.10344828 0.89655172)  
##          26) texture_mean>=3.388429 3   0 M (1.00000000 0.00000000) *
##          27) texture_mean< 3.388429 26   0 B (0.00000000 1.00000000) *
##       7) smoothness_mean>=-2.3007 37   2 B (0.05405405 0.94594595)  
##        14) smoothness_worst>=-1.435212 2   0 M (1.00000000 0.00000000) *
##        15) smoothness_worst< -1.435212 35   0 B (0.00000000 1.00000000) *
## 
## $trees[[78]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 912 393 B (0.43092105 0.56907895)  
##    2) symmetry_worst>=-2.20425 838 378 B (0.45107399 0.54892601)  
##      4) smoothness_mean>=-2.423454 584 289 B (0.49486301 0.50513699)  
##        8) symmetry_worst< -1.925345 103  27 M (0.73786408 0.26213592)  
##         16) symmetry_worst>=-1.966829 41   2 M (0.95121951 0.04878049)  
##           32) texture_mean>=2.753964 40   1 M (0.97500000 0.02500000)  
##             64) smoothness_mean< -2.225218 39   0 M (1.00000000 0.00000000) *
##             65) smoothness_mean>=-2.225218 1   0 B (0.00000000 1.00000000) *
##           33) texture_mean< 2.753964 1   0 B (0.00000000 1.00000000) *
##         17) symmetry_worst< -1.966829 62  25 M (0.59677419 0.40322581)  
##           34) texture_worst>=4.85229 18   0 M (1.00000000 0.00000000) *
##           35) texture_worst< 4.85229 44  19 B (0.43181818 0.56818182)  
##             70) smoothness_worst< -1.471555 34  15 M (0.55882353 0.44117647) *
##             71) smoothness_worst>=-1.471555 10   0 B (0.00000000 1.00000000) *
##        9) symmetry_worst>=-1.925345 481 213 B (0.44282744 0.55717256)  
##         18) symmetry_worst>=-1.839419 421 205 B (0.48693587 0.51306413)  
##           36) compactness_se>=-3.66733 169  63 M (0.62721893 0.37278107)  
##             72) compactness_se< -3.494301 45   4 M (0.91111111 0.08888889) *
##             73) compactness_se>=-3.494301 124  59 M (0.52419355 0.47580645) *
##           37) compactness_se< -3.66733 252  99 B (0.39285714 0.60714286)  
##             74) symmetry_worst< -1.749307 60  24 M (0.60000000 0.40000000) *
##             75) symmetry_worst>=-1.749307 192  63 B (0.32812500 0.67187500) *
##         19) symmetry_worst< -1.839419 60   8 B (0.13333333 0.86666667)  
##           38) texture_worst>=4.927821 4   0 M (1.00000000 0.00000000) *
##           39) texture_worst< 4.927821 56   4 B (0.07142857 0.92857143)  
##             78) smoothness_mean< -2.352223 8   4 M (0.50000000 0.50000000) *
##             79) smoothness_mean>=-2.352223 48   0 B (0.00000000 1.00000000) *
##      5) smoothness_mean< -2.423454 254  89 B (0.35039370 0.64960630)  
##       10) smoothness_mean< -2.454106 161  76 B (0.47204969 0.52795031)  
##         20) symmetry_worst>=-1.54778 27   2 M (0.92592593 0.07407407)  
##           40) texture_mean>=2.91613 25   0 M (1.00000000 0.00000000) *
##           41) texture_mean< 2.91613 2   0 B (0.00000000 1.00000000) *
##         21) symmetry_worst< -1.54778 134  51 B (0.38059701 0.61940299)  
##           42) symmetry_worst< -1.617873 111  51 B (0.45945946 0.54054054)  
##             84) symmetry_worst>=-1.844742 63  20 M (0.68253968 0.31746032) *
##             85) symmetry_worst< -1.844742 48   8 B (0.16666667 0.83333333) *
##           43) symmetry_worst>=-1.617873 23   0 B (0.00000000 1.00000000) *
##       11) smoothness_mean>=-2.454106 93  13 B (0.13978495 0.86021505)  
##         22) texture_worst< 4.536807 15   7 M (0.53333333 0.46666667)  
##           44) smoothness_worst< -1.595541 8   0 M (1.00000000 0.00000000) *
##           45) smoothness_worst>=-1.595541 7   0 B (0.00000000 1.00000000) *
##         23) texture_worst>=4.536807 78   5 B (0.06410256 0.93589744)  
##           46) symmetry_worst< -1.993222 13   5 B (0.38461538 0.61538462)  
##             92) smoothness_worst>=-1.525709 4   0 M (1.00000000 0.00000000) *
##             93) smoothness_worst< -1.525709 9   1 B (0.11111111 0.88888889) *
##           47) symmetry_worst>=-1.993222 65   0 B (0.00000000 1.00000000) *
##    3) symmetry_worst< -2.20425 74  15 B (0.20270270 0.79729730)  
##      6) compactness_se>=-3.487878 16   6 M (0.62500000 0.37500000)  
##       12) compactness_se< -3.248462 11   1 M (0.90909091 0.09090909)  
##         24) texture_mean>=2.822066 10   0 M (1.00000000 0.00000000) *
##         25) texture_mean< 2.822066 1   0 B (0.00000000 1.00000000) *
##       13) compactness_se>=-3.248462 5   0 B (0.00000000 1.00000000) *
##      7) compactness_se< -3.487878 58   5 B (0.08620690 0.91379310)  
##       14) compactness_se< -4.480041 8   3 M (0.62500000 0.37500000)  
##         28) smoothness_mean< -2.271294 5   0 M (1.00000000 0.00000000) *
##         29) smoothness_mean>=-2.271294 3   0 B (0.00000000 1.00000000) *
##       15) compactness_se>=-4.480041 50   0 B (0.00000000 1.00000000) *
## 
## $trees[[79]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 423 B (0.46381579 0.53618421)  
##     2) texture_mean>=2.963467 526 247 M (0.53041825 0.46958175)  
##       4) symmetry_worst>=-1.067772 13   0 M (1.00000000 0.00000000) *
##       5) symmetry_worst< -1.067772 513 247 M (0.51851852 0.48148148)  
##        10) smoothness_mean< -2.093138 500 234 M (0.53200000 0.46800000)  
##          20) compactness_se>=-4.706178 489 223 M (0.54396728 0.45603272)  
##            40) symmetry_worst< -1.132261 478 212 M (0.55648536 0.44351464)  
##              80) symmetry_worst>=-1.41845 19   0 M (1.00000000 0.00000000) *
##              81) symmetry_worst< -1.41845 459 212 M (0.53812636 0.46187364) *
##            41) symmetry_worst>=-1.132261 11   0 B (0.00000000 1.00000000) *
##          21) compactness_se< -4.706178 11   0 B (0.00000000 1.00000000) *
##        11) smoothness_mean>=-2.093138 13   0 B (0.00000000 1.00000000) *
##     3) texture_mean< 2.963467 386 144 B (0.37305699 0.62694301)  
##       6) smoothness_mean>=-2.333148 197  93 B (0.47208122 0.52791878)  
##        12) smoothness_worst< -1.477389 111  46 M (0.58558559 0.41441441)  
##          24) smoothness_worst>=-1.482701 27   1 M (0.96296296 0.03703704)  
##            48) texture_mean< 2.893521 26   0 M (1.00000000 0.00000000) *
##            49) texture_mean>=2.893521 1   0 B (0.00000000 1.00000000) *
##          25) smoothness_worst< -1.482701 84  39 B (0.46428571 0.53571429)  
##            50) smoothness_worst< -1.567043 14   1 M (0.92857143 0.07142857)  
##             100) smoothness_mean>=-2.310275 13   0 M (1.00000000 0.00000000) *
##             101) smoothness_mean< -2.310275 1   0 B (0.00000000 1.00000000) *
##            51) smoothness_worst>=-1.567043 70  26 B (0.37142857 0.62857143)  
##             102) texture_worst>=4.522453 29  11 M (0.62068966 0.37931034) *
##             103) texture_worst< 4.522453 41   8 B (0.19512195 0.80487805) *
##        13) smoothness_worst>=-1.477389 86  28 B (0.32558140 0.67441860)  
##          26) texture_mean>=2.934384 13   1 M (0.92307692 0.07692308)  
##            52) texture_worst< 4.599229 12   0 M (1.00000000 0.00000000) *
##            53) texture_worst>=4.599229 1   0 B (0.00000000 1.00000000) *
##          27) texture_mean< 2.934384 73  16 B (0.21917808 0.78082192)  
##            54) symmetry_worst>=-1.36527 15   6 M (0.60000000 0.40000000)  
##             108) texture_mean>=2.706904 10   1 M (0.90000000 0.10000000) *
##             109) texture_mean< 2.706904 5   0 B (0.00000000 1.00000000) *
##            55) symmetry_worst< -1.36527 58   7 B (0.12068966 0.87931034)  
##             110) texture_mean< 2.515298 3   0 M (1.00000000 0.00000000) *
##             111) texture_mean>=2.515298 55   4 B (0.07272727 0.92727273) *
##       7) smoothness_mean< -2.333148 189  51 B (0.26984127 0.73015873)  
##        14) compactness_se< -4.650552 18   3 M (0.83333333 0.16666667)  
##          28) smoothness_mean< -2.441817 15   0 M (1.00000000 0.00000000) *
##          29) smoothness_mean>=-2.441817 3   0 B (0.00000000 1.00000000) *
##        15) compactness_se>=-4.650552 171  36 B (0.21052632 0.78947368)  
##          30) smoothness_worst>=-1.472307 22  10 M (0.54545455 0.45454545)  
##            60) symmetry_worst>=-1.64088 13   2 M (0.84615385 0.15384615)  
##             120) texture_mean>=2.735974 11   0 M (1.00000000 0.00000000) *
##             121) texture_mean< 2.735974 2   0 B (0.00000000 1.00000000) *
##            61) symmetry_worst< -1.64088 9   1 B (0.11111111 0.88888889)  
##             122) smoothness_mean>=-2.363458 1   0 M (1.00000000 0.00000000) *
##             123) smoothness_mean< -2.363458 8   0 B (0.00000000 1.00000000) *
##          31) smoothness_worst< -1.472307 149  24 B (0.16107383 0.83892617)  
##            62) symmetry_worst< -1.692331 91  24 B (0.26373626 0.73626374)  
##             124) symmetry_worst>=-1.815934 40  20 M (0.50000000 0.50000000) *
##             125) symmetry_worst< -1.815934 51   4 B (0.07843137 0.92156863) *
##            63) symmetry_worst>=-1.692331 58   0 B (0.00000000 1.00000000) *
## 
## $trees[[80]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 367 B (0.40241228 0.59758772)  
##     2) symmetry_worst>=-1.322543 38  10 M (0.73684211 0.26315789)  
##       4) compactness_se< -2.646661 34   6 M (0.82352941 0.17647059)  
##         8) smoothness_worst>=-1.497484 27   1 M (0.96296296 0.03703704)  
##          16) texture_mean>=2.644674 26   0 M (1.00000000 0.00000000) *
##          17) texture_mean< 2.644674 1   0 B (0.00000000 1.00000000) *
##         9) smoothness_worst< -1.497484 7   2 B (0.28571429 0.71428571)  
##          18) texture_mean>=3.126045 2   0 M (1.00000000 0.00000000) *
##          19) texture_mean< 3.126045 5   0 B (0.00000000 1.00000000) *
##       5) compactness_se>=-2.646661 4   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.322543 874 339 B (0.38787185 0.61212815)  
##       6) compactness_se< -4.098353 231 112 B (0.48484848 0.51515152)  
##        12) smoothness_mean< -2.291157 203  94 M (0.53694581 0.46305419)  
##          24) smoothness_mean>=-2.368246 45   8 M (0.82222222 0.17777778)  
##            48) symmetry_worst< -1.476085 41   4 M (0.90243902 0.09756098)  
##              96) smoothness_mean< -2.299097 32   0 M (1.00000000 0.00000000) *
##              97) smoothness_mean>=-2.299097 9   4 M (0.55555556 0.44444444) *
##            49) symmetry_worst>=-1.476085 4   0 B (0.00000000 1.00000000) *
##          25) smoothness_mean< -2.368246 158  72 B (0.45569620 0.54430380)  
##            50) texture_worst< 4.984637 132  62 M (0.53030303 0.46969697)  
##             100) texture_mean>=2.976294 49   9 M (0.81632653 0.18367347) *
##             101) texture_mean< 2.976294 83  30 B (0.36144578 0.63855422) *
##            51) texture_worst>=4.984637 26   2 B (0.07692308 0.92307692)  
##             102) compactness_se>=-4.265617 1   0 M (1.00000000 0.00000000) *
##             103) compactness_se< -4.265617 25   1 B (0.04000000 0.96000000) *
##        13) smoothness_mean>=-2.291157 28   3 B (0.10714286 0.89285714)  
##          26) texture_worst>=4.59101 6   3 M (0.50000000 0.50000000)  
##            52) smoothness_mean>=-2.22149 3   0 M (1.00000000 0.00000000) *
##            53) smoothness_mean< -2.22149 3   0 B (0.00000000 1.00000000) *
##          27) texture_worst< 4.59101 22   0 B (0.00000000 1.00000000) *
##       7) compactness_se>=-4.098353 643 227 B (0.35303266 0.64696734)  
##        14) compactness_se>=-4.022675 581 223 B (0.38382100 0.61617900)  
##          28) smoothness_worst>=-1.499656 248 119 M (0.52016129 0.47983871)  
##            56) smoothness_mean>=-2.288684 135  45 M (0.66666667 0.33333333)  
##             112) texture_mean>=2.920399 94  17 M (0.81914894 0.18085106) *
##             113) texture_mean< 2.920399 41  13 B (0.31707317 0.68292683) *
##            57) smoothness_mean< -2.288684 113  39 B (0.34513274 0.65486726)  
##             114) symmetry_worst>=-1.431268 9   0 M (1.00000000 0.00000000) *
##             115) symmetry_worst< -1.431268 104  30 B (0.28846154 0.71153846) *
##          29) smoothness_worst< -1.499656 333  94 B (0.28228228 0.71771772)  
##            58) smoothness_worst< -1.515751 263  91 B (0.34600760 0.65399240)  
##             116) compactness_se>=-3.738233 190  82 B (0.43157895 0.56842105) *
##             117) compactness_se< -3.738233 73   9 B (0.12328767 0.87671233) *
##            59) smoothness_worst>=-1.515751 70   3 B (0.04285714 0.95714286)  
##             118) compactness_se< -3.450179 17   3 B (0.17647059 0.82352941) *
##             119) compactness_se>=-3.450179 53   0 B (0.00000000 1.00000000) *
##        15) compactness_se< -4.022675 62   4 B (0.06451613 0.93548387)  
##          30) texture_mean>=3.112668 4   0 M (1.00000000 0.00000000) *
##          31) texture_mean< 3.112668 58   0 B (0.00000000 1.00000000) *
## 
## $trees[[81]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 402 B (0.44078947 0.55921053)  
##     2) compactness_se>=-3.812868 475 237 B (0.49894737 0.50105263)  
##       4) smoothness_worst< -1.400053 416 193 M (0.53605769 0.46394231)  
##         8) smoothness_worst>=-1.450407 25   1 M (0.96000000 0.04000000)  
##          16) smoothness_mean>=-2.420336 24   0 M (1.00000000 0.00000000) *
##          17) smoothness_mean< -2.420336 1   0 B (0.00000000 1.00000000) *
##         9) smoothness_worst< -1.450407 391 192 M (0.50895141 0.49104859)  
##          18) smoothness_worst< -1.532606 164  63 M (0.61585366 0.38414634)  
##            36) smoothness_worst>=-1.586874 81  18 M (0.77777778 0.22222222)  
##              72) texture_mean< 3.168177 68   9 M (0.86764706 0.13235294) *
##              73) texture_mean>=3.168177 13   4 B (0.30769231 0.69230769) *
##            37) smoothness_worst< -1.586874 83  38 B (0.45783133 0.54216867)  
##              74) smoothness_worst< -1.59459 68  30 M (0.55882353 0.44117647) *
##              75) smoothness_worst>=-1.59459 15   0 B (0.00000000 1.00000000) *
##          19) smoothness_worst>=-1.532606 227  98 B (0.43171806 0.56828194)  
##            38) smoothness_mean< -2.453563 18   0 M (1.00000000 0.00000000) *
##            39) smoothness_mean>=-2.453563 209  80 B (0.38277512 0.61722488)  
##              78) smoothness_mean>=-2.367658 161  76 B (0.47204969 0.52795031) *
##              79) smoothness_mean< -2.367658 48   4 B (0.08333333 0.91666667) *
##       5) smoothness_worst>=-1.400053 59  14 B (0.23728814 0.76271186)  
##        10) smoothness_worst>=-1.395608 31  14 B (0.45161290 0.54838710)  
##          20) smoothness_mean< -2.194024 8   0 M (1.00000000 0.00000000) *
##          21) smoothness_mean>=-2.194024 23   6 B (0.26086957 0.73913043)  
##            42) symmetry_worst>=-1.596878 9   3 M (0.66666667 0.33333333)  
##              84) texture_mean>=2.688296 6   0 M (1.00000000 0.00000000) *
##              85) texture_mean< 2.688296 3   0 B (0.00000000 1.00000000) *
##            43) symmetry_worst< -1.596878 14   0 B (0.00000000 1.00000000) *
##        11) smoothness_worst< -1.395608 28   0 B (0.00000000 1.00000000) *
##     3) compactness_se< -3.812868 437 165 B (0.37757437 0.62242563)  
##       6) compactness_se< -3.867535 394 163 B (0.41370558 0.58629442)  
##        12) compactness_se>=-3.883925 24   1 M (0.95833333 0.04166667)  
##          24) texture_mean>=2.689116 23   0 M (1.00000000 0.00000000) *
##          25) texture_mean< 2.689116 1   0 B (0.00000000 1.00000000) *
##        13) compactness_se< -3.883925 370 140 B (0.37837838 0.62162162)  
##          26) texture_mean>=2.803607 341 140 B (0.41055718 0.58944282)  
##            52) compactness_se< -3.935037 300 135 B (0.45000000 0.55000000)  
##             104) compactness_se>=-3.977364 18   1 M (0.94444444 0.05555556) *
##             105) compactness_se< -3.977364 282 118 B (0.41843972 0.58156028) *
##            53) compactness_se>=-3.935037 41   5 B (0.12195122 0.87804878)  
##             106) smoothness_mean>=-2.240561 4   0 M (1.00000000 0.00000000) *
##             107) smoothness_mean< -2.240561 37   1 B (0.02702703 0.97297297) *
##          27) texture_mean< 2.803607 29   0 B (0.00000000 1.00000000) *
##       7) compactness_se>=-3.867535 43   2 B (0.04651163 0.95348837)  
##        14) smoothness_worst>=-1.464982 4   2 M (0.50000000 0.50000000)  
##          28) texture_mean< 2.90276 2   0 M (1.00000000 0.00000000) *
##          29) texture_mean>=2.90276 2   0 B (0.00000000 1.00000000) *
##        15) smoothness_worst< -1.464982 39   0 B (0.00000000 1.00000000) *
## 
## $trees[[82]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 434 M (0.52412281 0.47587719)  
##     2) symmetry_worst>=-1.641484 356 133 M (0.62640449 0.37359551)  
##       4) smoothness_mean>=-2.320393 169  40 M (0.76331361 0.23668639)  
##         8) compactness_se< -2.780114 162  33 M (0.79629630 0.20370370)  
##          16) compactness_se>=-4.463708 157  28 M (0.82165605 0.17834395)  
##            32) smoothness_mean< -2.301086 34   0 M (1.00000000 0.00000000) *
##            33) smoothness_mean>=-2.301086 123  28 M (0.77235772 0.22764228)  
##              66) smoothness_mean>=-2.296604 118  23 M (0.80508475 0.19491525) *
##              67) smoothness_mean< -2.296604 5   0 B (0.00000000 1.00000000) *
##          17) compactness_se< -4.463708 5   0 B (0.00000000 1.00000000) *
##         9) compactness_se>=-2.780114 7   0 B (0.00000000 1.00000000) *
##       5) smoothness_mean< -2.320393 187  93 M (0.50267380 0.49732620)  
##        10) compactness_se< -4.178455 62  14 M (0.77419355 0.22580645)  
##          20) texture_mean< 3.110176 50   6 M (0.88000000 0.12000000)  
##            40) texture_mean>=2.906784 47   3 M (0.93617021 0.06382979)  
##              80) texture_worst>=4.340304 46   2 M (0.95652174 0.04347826) *
##              81) texture_worst< 4.340304 1   0 B (0.00000000 1.00000000) *
##            41) texture_mean< 2.906784 3   0 B (0.00000000 1.00000000) *
##          21) texture_mean>=3.110176 12   4 B (0.33333333 0.66666667)  
##            42) texture_worst>=5.204837 4   0 M (1.00000000 0.00000000) *
##            43) texture_worst< 5.204837 8   0 B (0.00000000 1.00000000) *
##        11) compactness_se>=-4.178455 125  46 B (0.36800000 0.63200000)  
##          22) texture_worst>=4.993407 13   0 M (1.00000000 0.00000000) *
##          23) texture_worst< 4.993407 112  33 B (0.29464286 0.70535714)  
##            46) symmetry_worst< -1.638169 14   1 M (0.92857143 0.07142857)  
##              92) texture_mean>=2.7241 13   0 M (1.00000000 0.00000000) *
##              93) texture_mean< 2.7241 1   0 B (0.00000000 1.00000000) *
##            47) symmetry_worst>=-1.638169 98  20 B (0.20408163 0.79591837)  
##              94) symmetry_worst>=-1.429489 19   8 M (0.57894737 0.42105263) *
##              95) symmetry_worst< -1.429489 79   9 B (0.11392405 0.88607595) *
##     3) symmetry_worst< -1.641484 556 255 B (0.45863309 0.54136691)  
##       6) smoothness_mean< -2.237735 479 238 M (0.50313152 0.49686848)  
##        12) smoothness_mean>=-2.257137 21   1 M (0.95238095 0.04761905)  
##          24) compactness_se>=-3.898257 20   0 M (1.00000000 0.00000000) *
##          25) compactness_se< -3.898257 1   0 B (0.00000000 1.00000000) *
##        13) smoothness_mean< -2.257137 458 221 B (0.48253275 0.51746725)  
##          26) symmetry_worst< -1.750623 349 163 M (0.53295129 0.46704871)  
##            52) texture_worst>=4.897936 81  23 M (0.71604938 0.28395062)  
##             104) symmetry_worst>=-2.257286 75  17 M (0.77333333 0.22666667) *
##             105) symmetry_worst< -2.257286 6   0 B (0.00000000 1.00000000) *
##            53) texture_worst< 4.897936 268 128 B (0.47761194 0.52238806)  
##             106) texture_worst< 4.751358 239 112 M (0.53138075 0.46861925) *
##             107) texture_worst>=4.751358 29   1 B (0.03448276 0.96551724) *
##          27) symmetry_worst>=-1.750623 109  35 B (0.32110092 0.67889908)  
##            54) texture_mean>=2.955415 71  35 B (0.49295775 0.50704225)  
##             108) symmetry_worst>=-1.716495 27   3 M (0.88888889 0.11111111) *
##             109) symmetry_worst< -1.716495 44  11 B (0.25000000 0.75000000) *
##            55) texture_mean< 2.955415 38   0 B (0.00000000 1.00000000) *
##       7) smoothness_mean>=-2.237735 77  14 B (0.18181818 0.81818182)  
##        14) smoothness_worst< -1.56036 3   0 M (1.00000000 0.00000000) *
##        15) smoothness_worst>=-1.56036 74  11 B (0.14864865 0.85135135)  
##          30) texture_mean>=3.044046 12   6 M (0.50000000 0.50000000)  
##            60) smoothness_mean< -2.120284 6   0 M (1.00000000 0.00000000) *
##            61) smoothness_mean>=-2.120284 6   0 B (0.00000000 1.00000000) *
##          31) texture_mean< 3.044046 62   5 B (0.08064516 0.91935484)  
##            62) compactness_se>=-3.011681 2   0 M (1.00000000 0.00000000) *
##            63) compactness_se< -3.011681 60   3 B (0.05000000 0.95000000)  
##             126) compactness_se< -4.140724 10   3 B (0.30000000 0.70000000) *
##             127) compactness_se>=-4.140724 50   0 B (0.00000000 1.00000000) *
## 
## $trees[[83]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 419 B (0.45942982 0.54057018)  
##     2) symmetry_worst>=-1.658814 361 161 M (0.55401662 0.44598338)  
##       4) texture_worst>=4.605737 211  68 M (0.67772512 0.32227488)  
##         8) symmetry_worst< -1.606972 49   5 M (0.89795918 0.10204082)  
##          16) texture_mean< 3.14185 43   0 M (1.00000000 0.00000000) *
##          17) texture_mean>=3.14185 6   1 B (0.16666667 0.83333333)  
##            34) texture_mean>=3.244071 1   0 M (1.00000000 0.00000000) *
##            35) texture_mean< 3.244071 5   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst>=-1.606972 162  63 M (0.61111111 0.38888889)  
##          18) symmetry_worst>=-1.591238 151  52 M (0.65562914 0.34437086)  
##            36) texture_worst>=4.993407 30   2 M (0.93333333 0.06666667)  
##              72) compactness_se>=-4.507761 28   0 M (1.00000000 0.00000000) *
##              73) compactness_se< -4.507761 2   0 B (0.00000000 1.00000000) *
##            37) texture_worst< 4.993407 121  50 M (0.58677686 0.41322314)  
##              74) compactness_se< -3.768789 65  15 M (0.76923077 0.23076923) *
##              75) compactness_se>=-3.768789 56  21 B (0.37500000 0.62500000) *
##          19) symmetry_worst< -1.591238 11   0 B (0.00000000 1.00000000) *
##       5) texture_worst< 4.605737 150  57 B (0.38000000 0.62000000)  
##        10) smoothness_mean>=-2.171581 28   4 M (0.85714286 0.14285714)  
##          20) compactness_se>=-3.95959 26   2 M (0.92307692 0.07692308)  
##            40) smoothness_worst< -1.333822 23   0 M (1.00000000 0.00000000) *
##            41) smoothness_worst>=-1.333822 3   1 B (0.33333333 0.66666667)  
##              82) texture_mean>=2.688296 1   0 M (1.00000000 0.00000000) *
##              83) texture_mean< 2.688296 2   0 B (0.00000000 1.00000000) *
##          21) compactness_se< -3.95959 2   0 B (0.00000000 1.00000000) *
##        11) smoothness_mean< -2.171581 122  33 B (0.27049180 0.72950820)  
##          22) smoothness_worst>=-1.472112 43  21 B (0.48837209 0.51162791)  
##            44) symmetry_worst< -1.397194 17   2 M (0.88235294 0.11764706)  
##              88) texture_worst>=4.110502 15   0 M (1.00000000 0.00000000) *
##              89) texture_worst< 4.110502 2   0 B (0.00000000 1.00000000) *
##            45) symmetry_worst>=-1.397194 26   6 B (0.23076923 0.76923077)  
##              90) texture_worst< 4.074625 5   0 M (1.00000000 0.00000000) *
##              91) texture_worst>=4.074625 21   1 B (0.04761905 0.95238095) *
##          23) smoothness_worst< -1.472112 79  12 B (0.15189873 0.84810127)  
##            46) texture_mean>=2.975525 9   2 M (0.77777778 0.22222222)  
##              92) smoothness_mean< -2.275789 7   0 M (1.00000000 0.00000000) *
##              93) smoothness_mean>=-2.275789 2   0 B (0.00000000 1.00000000) *
##            47) texture_mean< 2.975525 70   5 B (0.07142857 0.92857143)  
##              94) smoothness_worst>=-1.496237 21   5 B (0.23809524 0.76190476) *
##              95) smoothness_worst< -1.496237 49   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.658814 551 219 B (0.39745917 0.60254083)  
##       6) symmetry_worst< -1.681676 525 219 B (0.41714286 0.58285714)  
##        12) smoothness_worst< -1.476214 417 192 B (0.46043165 0.53956835)  
##          24) smoothness_worst>=-1.482699 31   2 M (0.93548387 0.06451613)  
##            48) compactness_se>=-3.967101 27   0 M (1.00000000 0.00000000) *
##            49) compactness_se< -3.967101 4   2 M (0.50000000 0.50000000)  
##              98) texture_mean>=2.981733 2   0 M (1.00000000 0.00000000) *
##              99) texture_mean< 2.981733 2   0 B (0.00000000 1.00000000) *
##          25) smoothness_worst< -1.482699 386 163 B (0.42227979 0.57772021)  
##            50) symmetry_worst>=-1.815934 122  50 M (0.59016393 0.40983607)  
##             100) smoothness_worst< -1.484675 104  32 M (0.69230769 0.30769231) *
##             101) smoothness_worst>=-1.484675 18   0 B (0.00000000 1.00000000) *
##            51) symmetry_worst< -1.815934 264  91 B (0.34469697 0.65530303)  
##             102) texture_worst>=4.624749 102  48 M (0.52941176 0.47058824) *
##             103) texture_worst< 4.624749 162  37 B (0.22839506 0.77160494) *
##        13) smoothness_worst>=-1.476214 108  27 B (0.25000000 0.75000000)  
##          26) compactness_se>=-3.294139 11   3 M (0.72727273 0.27272727)  
##            52) texture_mean< 3.23593 8   0 M (1.00000000 0.00000000) *
##            53) texture_mean>=3.23593 3   0 B (0.00000000 1.00000000) *
##          27) compactness_se< -3.294139 97  19 B (0.19587629 0.80412371)  
##            54) texture_worst< 4.624204 40  14 B (0.35000000 0.65000000)  
##             108) texture_worst>=4.373034 15   1 M (0.93333333 0.06666667) *
##             109) texture_worst< 4.373034 25   0 B (0.00000000 1.00000000) *
##            55) texture_worst>=4.624204 57   5 B (0.08771930 0.91228070)  
##             110) texture_mean>=3.207548 2   0 M (1.00000000 0.00000000) *
##             111) texture_mean< 3.207548 55   3 B (0.05454545 0.94545455) *
##       7) symmetry_worst>=-1.681676 26   0 B (0.00000000 1.00000000) *
## 
## $trees[[84]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 413 B (0.45285088 0.54714912)  
##     2) symmetry_worst>=-1.580867 257 111 M (0.56809339 0.43190661)  
##       4) texture_worst>=4.477941 193  64 M (0.66839378 0.33160622)  
##         8) symmetry_worst< -1.557842 25   0 M (1.00000000 0.00000000) *
##         9) symmetry_worst>=-1.557842 168  64 M (0.61904762 0.38095238)  
##          18) symmetry_worst>=-1.549706 153  49 M (0.67973856 0.32026144)  
##            36) smoothness_worst< -1.513087 58   8 M (0.86206897 0.13793103)  
##              72) texture_mean>=2.904002 55   5 M (0.90909091 0.09090909) *
##              73) texture_mean< 2.904002 3   0 B (0.00000000 1.00000000) *
##            37) smoothness_worst>=-1.513087 95  41 M (0.56842105 0.43157895)  
##              74) smoothness_mean>=-2.277448 38   5 M (0.86842105 0.13157895) *
##              75) smoothness_mean< -2.277448 57  21 B (0.36842105 0.63157895) *
##          19) symmetry_worst< -1.549706 15   0 B (0.00000000 1.00000000) *
##       5) texture_worst< 4.477941 64  17 B (0.26562500 0.73437500)  
##        10) smoothness_worst>=-1.496237 36  17 B (0.47222222 0.52777778)  
##          20) texture_mean>=2.803301 13   1 M (0.92307692 0.07692308)  
##            40) compactness_se< -2.679301 12   0 M (1.00000000 0.00000000) *
##            41) compactness_se>=-2.679301 1   0 B (0.00000000 1.00000000) *
##          21) texture_mean< 2.803301 23   5 B (0.21739130 0.78260870)  
##            42) texture_mean< 2.531355 3   0 M (1.00000000 0.00000000) *
##            43) texture_mean>=2.531355 20   2 B (0.10000000 0.90000000)  
##              86) compactness_se>=-3.1317 2   0 M (1.00000000 0.00000000) *
##              87) compactness_se< -3.1317 18   0 B (0.00000000 1.00000000) *
##        11) smoothness_worst< -1.496237 28   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -1.580867 655 267 B (0.40763359 0.59236641)  
##       6) texture_mean>=3.431382 14   0 M (1.00000000 0.00000000) *
##       7) texture_mean< 3.431382 641 253 B (0.39469579 0.60530421)  
##        14) texture_mean< 3.21023 557 236 B (0.42369838 0.57630162)  
##          28) texture_worst>=5.016194 22   3 M (0.86363636 0.13636364)  
##            56) texture_worst< 5.280287 19   0 M (1.00000000 0.00000000) *
##            57) texture_worst>=5.280287 3   0 B (0.00000000 1.00000000) *
##          29) texture_worst< 5.016194 535 217 B (0.40560748 0.59439252)  
##            58) compactness_se< -3.391153 449 199 B (0.44320713 0.55679287)  
##             116) compactness_se>=-3.772915 172  69 M (0.59883721 0.40116279) *
##             117) compactness_se< -3.772915 277  96 B (0.34657040 0.65342960) *
##            59) compactness_se>=-3.391153 86  18 B (0.20930233 0.79069767)  
##             118) texture_mean>=3.038537 26   9 M (0.65384615 0.34615385) *
##             119) texture_mean< 3.038537 60   1 B (0.01666667 0.98333333) *
##        15) texture_mean>=3.21023 84  17 B (0.20238095 0.79761905)  
##          30) symmetry_worst>=-1.709835 5   0 M (1.00000000 0.00000000) *
##          31) symmetry_worst< -1.709835 79  12 B (0.15189873 0.84810127)  
##            62) compactness_se>=-3.424051 5   1 M (0.80000000 0.20000000)  
##             124) smoothness_mean< -2.457972 4   0 M (1.00000000 0.00000000) *
##             125) smoothness_mean>=-2.457972 1   0 B (0.00000000 1.00000000) *
##            63) compactness_se< -3.424051 74   8 B (0.10810811 0.89189189)  
##             126) smoothness_worst>=-1.435634 2   0 M (1.00000000 0.00000000) *
##             127) smoothness_worst< -1.435634 72   6 B (0.08333333 0.91666667) *
## 
## $trees[[85]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 442 B (0.48464912 0.51535088)  
##     2) smoothness_mean>=-2.392182 553 246 M (0.55515371 0.44484629)  
##       4) smoothness_mean< -2.38347 31   0 M (1.00000000 0.00000000) *
##       5) smoothness_mean>=-2.38347 522 246 M (0.52873563 0.47126437)  
##        10) smoothness_mean>=-2.380331 509 233 M (0.54223969 0.45776031)  
##          20) smoothness_worst< -1.562856 50   9 M (0.82000000 0.18000000)  
##            40) smoothness_worst>=-1.574324 18   0 M (1.00000000 0.00000000) *
##            41) smoothness_worst< -1.574324 32   9 M (0.71875000 0.28125000)  
##              82) smoothness_mean>=-2.337942 24   3 M (0.87500000 0.12500000) *
##              83) smoothness_mean< -2.337942 8   2 B (0.25000000 0.75000000) *
##          21) smoothness_worst>=-1.562856 459 224 M (0.51198257 0.48801743)  
##            42) symmetry_worst>=-2.151948 422 192 M (0.54502370 0.45497630)  
##              84) compactness_se>=-4.50262 406 176 M (0.56650246 0.43349754) *
##              85) compactness_se< -4.50262 16   0 B (0.00000000 1.00000000) *
##            43) symmetry_worst< -2.151948 37   5 B (0.13513514 0.86486486)  
##              86) compactness_se>=-3.382349 4   1 M (0.75000000 0.25000000) *
##              87) compactness_se< -3.382349 33   2 B (0.06060606 0.93939394) *
##        11) smoothness_mean< -2.380331 13   0 B (0.00000000 1.00000000) *
##     3) smoothness_mean< -2.392182 359 135 B (0.37604457 0.62395543)  
##       6) compactness_se< -3.941776 156  72 M (0.53846154 0.46153846)  
##        12) smoothness_mean>=-2.422045 44   6 M (0.86363636 0.13636364)  
##          24) smoothness_mean< -2.399979 39   1 M (0.97435897 0.02564103)  
##            48) symmetry_worst< -1.677281 33   0 M (1.00000000 0.00000000) *
##            49) symmetry_worst>=-1.677281 6   1 M (0.83333333 0.16666667)  
##              98) compactness_se>=-4.280953 5   0 M (1.00000000 0.00000000) *
##              99) compactness_se< -4.280953 1   0 B (0.00000000 1.00000000) *
##          25) smoothness_mean>=-2.399979 5   0 B (0.00000000 1.00000000) *
##        13) smoothness_mean< -2.422045 112  46 B (0.41071429 0.58928571)  
##          26) smoothness_mean< -2.443803 83  38 M (0.54216867 0.45783133)  
##            52) symmetry_worst>=-1.562165 19   1 M (0.94736842 0.05263158)  
##             104) texture_mean>=2.84952 18   0 M (1.00000000 0.00000000) *
##             105) texture_mean< 2.84952 1   0 B (0.00000000 1.00000000) *
##            53) symmetry_worst< -1.562165 64  27 B (0.42187500 0.57812500)  
##             106) smoothness_worst>=-1.638322 52  25 M (0.51923077 0.48076923) *
##             107) smoothness_worst< -1.638322 12   0 B (0.00000000 1.00000000) *
##          27) smoothness_mean>=-2.443803 29   1 B (0.03448276 0.96551724)  
##            54) smoothness_worst< -1.607486 2   1 M (0.50000000 0.50000000)  
##             108) texture_mean>=2.884013 1   0 M (1.00000000 0.00000000) *
##             109) texture_mean< 2.884013 1   0 B (0.00000000 1.00000000) *
##            55) smoothness_worst>=-1.607486 27   0 B (0.00000000 1.00000000) *
##       7) compactness_se>=-3.941776 203  51 B (0.25123153 0.74876847)  
##        14) smoothness_mean< -2.461054 88  42 B (0.47727273 0.52272727)  
##          28) smoothness_worst>=-1.556752 16   0 M (1.00000000 0.00000000) *
##          29) smoothness_worst< -1.556752 72  26 B (0.36111111 0.63888889)  
##            58) symmetry_worst< -2.106078 18   2 M (0.88888889 0.11111111)  
##             116) texture_mean>=3.076827 16   0 M (1.00000000 0.00000000) *
##             117) texture_mean< 3.076827 2   0 B (0.00000000 1.00000000) *
##            59) symmetry_worst>=-2.106078 54  10 B (0.18518519 0.81481481)  
##             118) symmetry_worst>=-1.816662 30  10 B (0.33333333 0.66666667) *
##             119) symmetry_worst< -1.816662 24   0 B (0.00000000 1.00000000) *
##        15) smoothness_mean>=-2.461054 115   9 B (0.07826087 0.92173913)  
##          30) texture_mean< 2.788514 7   2 M (0.71428571 0.28571429)  
##            60) texture_mean>=2.735767 5   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 2.735767 2   0 B (0.00000000 1.00000000) *
##          31) texture_mean>=2.788514 108   4 B (0.03703704 0.96296296)  
##            62) symmetry_worst>=-1.783406 34   4 B (0.11764706 0.88235294)  
##             124) symmetry_worst< -1.685469 4   0 M (1.00000000 0.00000000) *
##             125) symmetry_worst>=-1.685469 30   0 B (0.00000000 1.00000000) *
##            63) symmetry_worst< -1.783406 74   0 B (0.00000000 1.00000000) *
## 
## $trees[[86]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 363 B (0.39802632 0.60197368)  
##     2) compactness_se< -4.098353 266 127 M (0.52255639 0.47744361)  
##       4) smoothness_mean< -2.299097 212  84 M (0.60377358 0.39622642)  
##         8) smoothness_mean>=-2.426508 111  16 M (0.85585586 0.14414414)  
##          16) texture_mean>=2.799406 108  13 M (0.87962963 0.12037037)  
##            32) smoothness_worst>=-1.555906 80   3 M (0.96250000 0.03750000)  
##              64) symmetry_worst>=-2.212871 78   1 M (0.98717949 0.01282051) *
##              65) symmetry_worst< -2.212871 2   0 B (0.00000000 1.00000000) *
##            33) smoothness_worst< -1.555906 28  10 M (0.64285714 0.35714286)  
##              66) smoothness_worst< -1.567258 21   3 M (0.85714286 0.14285714) *
##              67) smoothness_worst>=-1.567258 7   0 B (0.00000000 1.00000000) *
##          17) texture_mean< 2.799406 3   0 B (0.00000000 1.00000000) *
##         9) smoothness_mean< -2.426508 101  33 B (0.32673267 0.67326733)  
##          18) symmetry_worst>=-1.695215 32  12 M (0.62500000 0.37500000)  
##            36) smoothness_mean< -2.449649 24   4 M (0.83333333 0.16666667)  
##              72) smoothness_worst< -1.549205 21   1 M (0.95238095 0.04761905) *
##              73) smoothness_worst>=-1.549205 3   0 B (0.00000000 1.00000000) *
##            37) smoothness_mean>=-2.449649 8   0 B (0.00000000 1.00000000) *
##          19) symmetry_worst< -1.695215 69  13 B (0.18840580 0.81159420)  
##            38) smoothness_worst>=-1.55307 15   7 M (0.53333333 0.46666667)  
##              76) smoothness_mean< -2.440656 8   0 M (1.00000000 0.00000000) *
##              77) smoothness_mean>=-2.440656 7   0 B (0.00000000 1.00000000) *
##            39) smoothness_worst< -1.55307 54   5 B (0.09259259 0.90740741)  
##              78) texture_mean< 2.969886 16   5 B (0.31250000 0.68750000) *
##              79) texture_mean>=2.969886 38   0 B (0.00000000 1.00000000) *
##       5) smoothness_mean>=-2.299097 54  11 B (0.20370370 0.79629630)  
##        10) compactness_se>=-4.222363 27  11 B (0.40740741 0.59259259)  
##          20) compactness_se< -4.178775 12   1 M (0.91666667 0.08333333)  
##            40) smoothness_worst>=-1.49438 11   0 M (1.00000000 0.00000000) *
##            41) smoothness_worst< -1.49438 1   0 B (0.00000000 1.00000000) *
##          21) compactness_se>=-4.178775 15   0 B (0.00000000 1.00000000) *
##        11) compactness_se< -4.222363 27   0 B (0.00000000 1.00000000) *
##     3) compactness_se>=-4.098353 646 224 B (0.34674923 0.65325077)  
##       6) smoothness_mean>=-2.394871 470 190 B (0.40425532 0.59574468)  
##        12) compactness_se>=-4.025757 425 190 B (0.44705882 0.55294118)  
##          24) texture_worst>=4.895983 65  20 M (0.69230769 0.30769231)  
##            48) symmetry_worst>=-2.207988 55  10 M (0.81818182 0.18181818)  
##              96) texture_mean< 3.36829 49   4 M (0.91836735 0.08163265) *
##              97) texture_mean>=3.36829 6   0 B (0.00000000 1.00000000) *
##            49) symmetry_worst< -2.207988 10   0 B (0.00000000 1.00000000) *
##          25) texture_worst< 4.895983 360 145 B (0.40277778 0.59722222)  
##            50) texture_worst< 4.782287 302 136 B (0.45033113 0.54966887)  
##             100) smoothness_mean< -2.366217 21   2 M (0.90476190 0.09523810) *
##             101) smoothness_mean>=-2.366217 281 117 B (0.41637011 0.58362989) *
##            51) texture_worst>=4.782287 58   9 B (0.15517241 0.84482759)  
##             102) compactness_se>=-2.785754 7   0 M (1.00000000 0.00000000) *
##             103) compactness_se< -2.785754 51   2 B (0.03921569 0.96078431) *
##        13) compactness_se< -4.025757 45   0 B (0.00000000 1.00000000) *
##       7) smoothness_mean< -2.394871 176  34 B (0.19318182 0.80681818)  
##        14) symmetry_worst>=-1.466953 8   2 M (0.75000000 0.25000000)  
##          28) texture_worst< 4.774321 5   0 M (1.00000000 0.00000000) *
##          29) texture_worst>=4.774321 3   1 B (0.33333333 0.66666667)  
##            58) texture_mean>=3.044129 1   0 M (1.00000000 0.00000000) *
##            59) texture_mean< 3.044129 2   0 B (0.00000000 1.00000000) *
##        15) symmetry_worst< -1.466953 168  28 B (0.16666667 0.83333333)  
##          30) smoothness_worst< -1.598711 56  19 B (0.33928571 0.66071429)  
##            60) smoothness_worst>=-1.653746 18   5 M (0.72222222 0.27777778)  
##             120) compactness_se< -3.268044 14   1 M (0.92857143 0.07142857) *
##             121) compactness_se>=-3.268044 4   0 B (0.00000000 1.00000000) *
##            61) smoothness_worst< -1.653746 38   6 B (0.15789474 0.84210526)  
##             122) compactness_se>=-2.979429 8   2 M (0.75000000 0.25000000) *
##             123) compactness_se< -2.979429 30   0 B (0.00000000 1.00000000) *
##          31) smoothness_worst>=-1.598711 112   9 B (0.08035714 0.91964286)  
##            62) texture_worst>=5.19153 4   1 M (0.75000000 0.25000000)  
##             124) smoothness_mean< -2.473552 3   0 M (1.00000000 0.00000000) *
##             125) smoothness_mean>=-2.473552 1   0 B (0.00000000 1.00000000) *
##            63) texture_worst< 5.19153 108   6 B (0.05555556 0.94444444)  
##             126) texture_worst>=4.568716 42   6 B (0.14285714 0.85714286) *
##             127) texture_worst< 4.568716 66   0 B (0.00000000 1.00000000) *
## 
## $trees[[87]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 451 M (0.50548246 0.49451754)  
##     2) smoothness_mean>=-2.423454 682 305 M (0.55278592 0.44721408)  
##       4) smoothness_worst< -1.4768 397 150 M (0.62216625 0.37783375)  
##         8) smoothness_worst>=-1.482107 52   2 M (0.96153846 0.03846154)  
##          16) texture_worst>=4.126187 50   0 M (1.00000000 0.00000000) *
##          17) texture_worst< 4.126187 2   0 B (0.00000000 1.00000000) *
##         9) smoothness_worst< -1.482107 345 148 M (0.57101449 0.42898551)  
##          18) compactness_se< -4.100467 83  21 M (0.74698795 0.25301205)  
##            36) smoothness_mean< -2.3007 75  13 M (0.82666667 0.17333333)  
##              72) texture_mean>=2.809289 71   9 M (0.87323944 0.12676056) *
##              73) texture_mean< 2.809289 4   0 B (0.00000000 1.00000000) *
##            37) smoothness_mean>=-2.3007 8   0 B (0.00000000 1.00000000) *
##          19) compactness_se>=-4.100467 262 127 M (0.51526718 0.48473282)  
##            38) symmetry_worst>=-1.835199 156  56 M (0.64102564 0.35897436)  
##              76) texture_worst>=4.57172 78  14 M (0.82051282 0.17948718) *
##              77) texture_worst< 4.57172 78  36 B (0.46153846 0.53846154) *
##            39) symmetry_worst< -1.835199 106  35 B (0.33018868 0.66981132)  
##              78) symmetry_worst< -2.923662 9   0 M (1.00000000 0.00000000) *
##              79) symmetry_worst>=-2.923662 97  26 B (0.26804124 0.73195876) *
##       5) smoothness_worst>=-1.4768 285 130 B (0.45614035 0.54385965)  
##        10) smoothness_worst>=-1.473476 251 121 M (0.51792829 0.48207171)  
##          20) symmetry_worst>=-1.721298 160  57 M (0.64375000 0.35625000)  
##            40) compactness_se>=-4.04059 112  27 M (0.75892857 0.24107143)  
##              80) smoothness_mean>=-2.359377 97  13 M (0.86597938 0.13402062) *
##              81) smoothness_mean< -2.359377 15   1 B (0.06666667 0.93333333) *
##            41) compactness_se< -4.04059 48  18 B (0.37500000 0.62500000)  
##              82) smoothness_mean< -2.294648 21   3 M (0.85714286 0.14285714) *
##              83) smoothness_mean>=-2.294648 27   0 B (0.00000000 1.00000000) *
##          21) symmetry_worst< -1.721298 91  27 B (0.29670330 0.70329670)  
##            42) texture_worst< 4.623467 23   6 M (0.73913043 0.26086957)  
##              84) texture_mean>=2.84692 18   1 M (0.94444444 0.05555556) *
##              85) texture_mean< 2.84692 5   0 B (0.00000000 1.00000000) *
##            43) texture_worst>=4.623467 68  10 B (0.14705882 0.85294118)  
##              86) symmetry_worst< -1.820896 18   8 M (0.55555556 0.44444444) *
##              87) symmetry_worst>=-1.820896 50   0 B (0.00000000 1.00000000) *
##        11) smoothness_worst< -1.473476 34   0 B (0.00000000 1.00000000) *
##     3) smoothness_mean< -2.423454 230  84 B (0.36521739 0.63478261)  
##       6) smoothness_mean< -2.441446 198  83 B (0.41919192 0.58080808)  
##        12) symmetry_worst>=-1.54778 21   4 M (0.80952381 0.19047619)  
##          24) smoothness_mean>=-2.487591 18   1 M (0.94444444 0.05555556)  
##            48) texture_mean>=2.844831 17   0 M (1.00000000 0.00000000) *
##            49) texture_mean< 2.844831 1   0 B (0.00000000 1.00000000) *
##          25) smoothness_mean< -2.487591 3   0 B (0.00000000 1.00000000) *
##        13) symmetry_worst< -1.54778 177  66 B (0.37288136 0.62711864)  
##          26) symmetry_worst< -1.750953 132  60 B (0.45454545 0.54545455)  
##            52) symmetry_worst>=-1.868413 47  15 M (0.68085106 0.31914894)  
##             104) texture_mean< 2.977229 32   4 M (0.87500000 0.12500000) *
##             105) texture_mean>=2.977229 15   4 B (0.26666667 0.73333333) *
##            53) symmetry_worst< -1.868413 85  28 B (0.32941176 0.67058824)  
##             106) compactness_se>=-3.514597 44  20 M (0.54545455 0.45454545) *
##             107) compactness_se< -3.514597 41   4 B (0.09756098 0.90243902) *
##          27) symmetry_worst>=-1.750953 45   6 B (0.13333333 0.86666667)  
##            54) texture_mean>=2.958874 17   6 B (0.35294118 0.64705882)  
##             108) compactness_se< -4.196102 7   1 M (0.85714286 0.14285714) *
##             109) compactness_se>=-4.196102 10   0 B (0.00000000 1.00000000) *
##            55) texture_mean< 2.958874 28   0 B (0.00000000 1.00000000) *
##       7) smoothness_mean>=-2.441446 32   1 B (0.03125000 0.96875000)  
##        14) smoothness_worst< -1.607486 1   0 M (1.00000000 0.00000000) *
##        15) smoothness_worst>=-1.607486 31   0 B (0.00000000 1.00000000) *
## 
## $trees[[88]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 452 M (0.50438596 0.49561404)  
##     2) smoothness_mean>=-2.367658 551 234 M (0.57531760 0.42468240)  
##       4) smoothness_worst< -1.476605 301  92 M (0.69435216 0.30564784)  
##         8) smoothness_worst>=-1.499656 100  10 M (0.90000000 0.10000000)  
##          16) compactness_se>=-3.907039 77   1 M (0.98701299 0.01298701)  
##            32) smoothness_mean< -2.221522 67   0 M (1.00000000 0.00000000) *
##            33) smoothness_mean>=-2.221522 10   1 M (0.90000000 0.10000000)  
##              66) smoothness_mean>=-2.204571 9   0 M (1.00000000 0.00000000) *
##              67) smoothness_mean< -2.204571 1   0 B (0.00000000 1.00000000) *
##          17) compactness_se< -3.907039 23   9 M (0.60869565 0.39130435)  
##            34) smoothness_mean< -2.262441 17   3 M (0.82352941 0.17647059)  
##              68) compactness_se>=-4.224388 14   0 M (1.00000000 0.00000000) *
##              69) compactness_se< -4.224388 3   0 B (0.00000000 1.00000000) *
##            35) smoothness_mean>=-2.262441 6   0 B (0.00000000 1.00000000) *
##         9) smoothness_worst< -1.499656 201  82 M (0.59203980 0.40796020)  
##          18) symmetry_worst< -1.543306 183  64 M (0.65027322 0.34972678)  
##            36) smoothness_mean< -2.313857 81  13 M (0.83950617 0.16049383)  
##              72) compactness_se< -3.492659 63   4 M (0.93650794 0.06349206) *
##              73) compactness_se>=-3.492659 18   9 M (0.50000000 0.50000000) *
##            37) smoothness_mean>=-2.313857 102  51 M (0.50000000 0.50000000)  
##              74) compactness_se>=-3.685572 60  18 M (0.70000000 0.30000000) *
##              75) compactness_se< -3.685572 42   9 B (0.21428571 0.78571429) *
##          19) symmetry_worst>=-1.543306 18   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst>=-1.476605 250 108 B (0.43200000 0.56800000)  
##        10) smoothness_worst>=-1.473672 221 108 B (0.48868778 0.51131222)  
##          20) texture_worst< 4.94309 186  82 M (0.55913978 0.44086022)  
##            40) smoothness_mean< -2.306533 57   9 M (0.84210526 0.15789474)  
##              80) smoothness_mean>=-2.361754 48   0 M (1.00000000 0.00000000) *
##              81) smoothness_mean< -2.361754 9   0 B (0.00000000 1.00000000) *
##            41) smoothness_mean>=-2.306533 129  56 B (0.43410853 0.56589147)  
##              82) texture_worst>=4.398698 92  42 M (0.54347826 0.45652174) *
##              83) texture_worst< 4.398698 37   6 B (0.16216216 0.83783784) *
##          21) texture_worst>=4.94309 35   4 B (0.11428571 0.88571429)  
##            42) compactness_se>=-3.116694 3   0 M (1.00000000 0.00000000) *
##            43) compactness_se< -3.116694 32   1 B (0.03125000 0.96875000)  
##              86) texture_mean>=3.237842 1   0 M (1.00000000 0.00000000) *
##              87) texture_mean< 3.237842 31   0 B (0.00000000 1.00000000) *
##        11) smoothness_worst< -1.473672 29   0 B (0.00000000 1.00000000) *
##     3) smoothness_mean< -2.367658 361 143 B (0.39612188 0.60387812)  
##       6) texture_mean< 3.336125 333 142 B (0.42642643 0.57357357)  
##        12) texture_worst>=4.975502 67  23 M (0.65671642 0.34328358)  
##          24) compactness_se>=-4.706178 58  14 M (0.75862069 0.24137931)  
##            48) symmetry_worst>=-2.145206 46   4 M (0.91304348 0.08695652)  
##              96) symmetry_worst< -1.637827 35   0 M (1.00000000 0.00000000) *
##              97) symmetry_worst>=-1.637827 11   4 M (0.63636364 0.36363636) *
##            49) symmetry_worst< -2.145206 12   2 B (0.16666667 0.83333333)  
##              98) texture_mean>=3.330945 2   0 M (1.00000000 0.00000000) *
##              99) texture_mean< 3.330945 10   0 B (0.00000000 1.00000000) *
##          25) compactness_se< -4.706178 9   0 B (0.00000000 1.00000000) *
##        13) texture_worst< 4.975502 266  98 B (0.36842105 0.63157895)  
##          26) smoothness_worst>=-1.604472 186  82 B (0.44086022 0.55913978)  
##            52) symmetry_worst< -2.035676 22   3 M (0.86363636 0.13636364)  
##             104) smoothness_worst< -1.540052 20   1 M (0.95000000 0.05000000) *
##             105) smoothness_worst>=-1.540052 2   0 B (0.00000000 1.00000000) *
##            53) symmetry_worst>=-2.035676 164  63 B (0.38414634 0.61585366)  
##             106) texture_mean>=2.921008 115  54 B (0.46956522 0.53043478) *
##             107) texture_mean< 2.921008 49   9 B (0.18367347 0.81632653) *
##          27) smoothness_worst< -1.604472 80  16 B (0.20000000 0.80000000)  
##            54) symmetry_worst>=-1.777195 22  10 B (0.45454545 0.54545455)  
##             108) texture_mean>=2.939162 13   3 M (0.76923077 0.23076923) *
##             109) texture_mean< 2.939162 9   0 B (0.00000000 1.00000000) *
##            55) symmetry_worst< -1.777195 58   6 B (0.10344828 0.89655172)  
##             110) compactness_se>=-2.951614 10   4 M (0.60000000 0.40000000) *
##             111) compactness_se< -2.951614 48   0 B (0.00000000 1.00000000) *
##       7) texture_mean>=3.336125 28   1 B (0.03571429 0.96428571)  
##        14) texture_mean>=3.452615 1   0 M (1.00000000 0.00000000) *
##        15) texture_mean< 3.452615 27   0 B (0.00000000 1.00000000) *
## 
## $trees[[89]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 426 B (0.46710526 0.53289474)  
##     2) smoothness_worst< -1.434076 775 387 M (0.50064516 0.49935484)  
##       4) smoothness_mean>=-2.172878 31   0 M (1.00000000 0.00000000) *
##       5) smoothness_mean< -2.172878 744 357 B (0.47983871 0.52016129)  
##        10) smoothness_worst>=-1.603315 627 302 M (0.51834131 0.48165869)  
##          20) texture_worst>=4.579906 328 126 M (0.61585366 0.38414634)  
##            40) texture_worst< 4.756552 150  32 M (0.78666667 0.21333333)  
##              80) texture_mean>=3.055881 60   0 M (1.00000000 0.00000000) *
##              81) texture_mean< 3.055881 90  32 M (0.64444444 0.35555556) *
##            41) texture_worst>=4.756552 178  84 B (0.47191011 0.52808989)  
##              82) smoothness_worst>=-1.504916 82  31 M (0.62195122 0.37804878) *
##              83) smoothness_worst< -1.504916 96  33 B (0.34375000 0.65625000) *
##          21) texture_worst< 4.579906 299 123 B (0.41137124 0.58862876)  
##            42) texture_worst< 4.545141 253 118 B (0.46640316 0.53359684)  
##              84) texture_worst>=4.522453 35   4 M (0.88571429 0.11428571) *
##              85) texture_worst< 4.522453 218  87 B (0.39908257 0.60091743) *
##            43) texture_worst>=4.545141 46   5 B (0.10869565 0.89130435)  
##              86) texture_mean>=3.035431 5   0 M (1.00000000 0.00000000) *
##              87) texture_mean< 3.035431 41   0 B (0.00000000 1.00000000) *
##        11) smoothness_worst< -1.603315 117  32 B (0.27350427 0.72649573)  
##          22) symmetry_worst>=-1.550826 14   4 M (0.71428571 0.28571429)  
##            44) smoothness_mean>=-2.43698 9   0 M (1.00000000 0.00000000) *
##            45) smoothness_mean< -2.43698 5   1 B (0.20000000 0.80000000)  
##              90) symmetry_worst>=-1.211778 1   0 M (1.00000000 0.00000000) *
##              91) symmetry_worst< -1.211778 4   0 B (0.00000000 1.00000000) *
##          23) symmetry_worst< -1.550826 103  22 B (0.21359223 0.78640777)  
##            46) texture_mean< 2.966301 22  10 M (0.54545455 0.45454545)  
##              92) texture_mean>=2.923842 16   4 M (0.75000000 0.25000000) *
##              93) texture_mean< 2.923842 6   0 B (0.00000000 1.00000000) *
##            47) texture_mean>=2.966301 81  10 B (0.12345679 0.87654321)  
##              94) smoothness_worst< -1.720903 11   5 B (0.45454545 0.54545455) *
##              95) smoothness_worst>=-1.720903 70   5 B (0.07142857 0.92857143) *
##     3) smoothness_worst>=-1.434076 137  38 B (0.27737226 0.72262774)  
##       6) symmetry_worst>=-1.270655 11   0 M (1.00000000 0.00000000) *
##       7) symmetry_worst< -1.270655 126  27 B (0.21428571 0.78571429)  
##        14) smoothness_mean>=-1.977294 8   1 M (0.87500000 0.12500000)  
##          28) texture_mean>=2.649801 7   0 M (1.00000000 0.00000000) *
##          29) texture_mean< 2.649801 1   0 B (0.00000000 1.00000000) *
##        15) smoothness_mean< -1.977294 118  20 B (0.16949153 0.83050847)  
##          30) compactness_se>=-3.311998 12   4 M (0.66666667 0.33333333)  
##            60) smoothness_mean>=-2.314128 8   0 M (1.00000000 0.00000000) *
##            61) smoothness_mean< -2.314128 4   0 B (0.00000000 1.00000000) *
##          31) compactness_se< -3.311998 106  12 B (0.11320755 0.88679245)  
##            62) smoothness_mean< -2.305218 9   4 M (0.55555556 0.44444444)  
##             124) texture_mean>=3.075523 4   0 M (1.00000000 0.00000000) *
##             125) texture_mean< 3.075523 5   1 B (0.20000000 0.80000000) *
##            63) smoothness_mean>=-2.305218 97   7 B (0.07216495 0.92783505)  
##             126) compactness_se< -3.475452 50   7 B (0.14000000 0.86000000) *
##             127) compactness_se>=-3.475452 47   0 B (0.00000000 1.00000000) *
## 
## $trees[[90]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 436 B (0.47807018 0.52192982)  
##     2) smoothness_worst< -1.434076 790 389 M (0.50759494 0.49240506)  
##       4) smoothness_worst>=-1.603315 662 301 M (0.54531722 0.45468278)  
##         8) smoothness_mean>=-2.477713 633 277 M (0.56240126 0.43759874)  
##          16) texture_mean< 2.81481 83  19 M (0.77108434 0.22891566)  
##            32) compactness_se>=-3.964431 76  12 M (0.84210526 0.15789474)  
##              64) compactness_se< -3.340373 67   5 M (0.92537313 0.07462687) *
##              65) compactness_se>=-3.340373 9   2 B (0.22222222 0.77777778) *
##            33) compactness_se< -3.964431 7   0 B (0.00000000 1.00000000) *
##          17) texture_mean>=2.81481 550 258 M (0.53090909 0.46909091)  
##            34) smoothness_worst< -1.59596 16   0 M (1.00000000 0.00000000) *
##            35) smoothness_worst>=-1.59596 534 258 M (0.51685393 0.48314607)  
##              70) symmetry_worst>=-2.193154 475 212 M (0.55368421 0.44631579) *
##              71) symmetry_worst< -2.193154 59  13 B (0.22033898 0.77966102) *
##         9) smoothness_mean< -2.477713 29   5 B (0.17241379 0.82758621)  
##          18) symmetry_worst< -2.155071 2   0 M (1.00000000 0.00000000) *
##          19) symmetry_worst>=-2.155071 27   3 B (0.11111111 0.88888889)  
##            38) texture_worst>=5.057104 7   3 B (0.42857143 0.57142857)  
##              76) smoothness_worst< -1.565575 3   0 M (1.00000000 0.00000000) *
##              77) smoothness_worst>=-1.565575 4   0 B (0.00000000 1.00000000) *
##            39) texture_worst< 5.057104 20   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst< -1.603315 128  40 B (0.31250000 0.68750000)  
##        10) smoothness_worst< -1.723213 12   1 M (0.91666667 0.08333333)  
##          20) texture_mean>=3.026052 11   0 M (1.00000000 0.00000000) *
##          21) texture_mean< 3.026052 1   0 B (0.00000000 1.00000000) *
##        11) smoothness_worst>=-1.723213 116  29 B (0.25000000 0.75000000)  
##          22) compactness_se< -4.200032 44  21 M (0.52272727 0.47727273)  
##            44) symmetry_worst>=-1.874628 31   9 M (0.70967742 0.29032258)  
##              88) texture_worst< 4.998675 24   2 M (0.91666667 0.08333333) *
##              89) texture_worst>=4.998675 7   0 B (0.00000000 1.00000000) *
##            45) symmetry_worst< -1.874628 13   1 B (0.07692308 0.92307692)  
##              90) texture_mean>=3.149769 1   0 M (1.00000000 0.00000000) *
##              91) texture_mean< 3.149769 12   0 B (0.00000000 1.00000000) *
##          23) compactness_se>=-4.200032 72   6 B (0.08333333 0.91666667)  
##            46) symmetry_worst>=-1.550826 5   1 M (0.80000000 0.20000000)  
##              92) smoothness_mean>=-2.49225 4   0 M (1.00000000 0.00000000) *
##              93) smoothness_mean< -2.49225 1   0 B (0.00000000 1.00000000) *
##            47) symmetry_worst< -1.550826 67   2 B (0.02985075 0.97014925)  
##              94) texture_mean< 2.958884 10   2 B (0.20000000 0.80000000) *
##              95) texture_mean>=2.958884 57   0 B (0.00000000 1.00000000) *
##     3) smoothness_worst>=-1.434076 122  35 B (0.28688525 0.71311475)  
##       6) symmetry_worst>=-1.232339 8   0 M (1.00000000 0.00000000) *
##       7) symmetry_worst< -1.232339 114  27 B (0.23684211 0.76315789)  
##        14) compactness_se>=-3.311998 12   4 M (0.66666667 0.33333333)  
##          28) texture_mean>=2.701935 10   2 M (0.80000000 0.20000000)  
##            56) smoothness_mean>=-2.314128 8   0 M (1.00000000 0.00000000) *
##            57) smoothness_mean< -2.314128 2   0 B (0.00000000 1.00000000) *
##          29) texture_mean< 2.701935 2   0 B (0.00000000 1.00000000) *
##        15) compactness_se< -3.311998 102  19 B (0.18627451 0.81372549)  
##          30) smoothness_mean>=-1.977294 5   0 M (1.00000000 0.00000000) *
##          31) smoothness_mean< -1.977294 97  14 B (0.14432990 0.85567010)  
##            62) compactness_se< -3.475452 54  14 B (0.25925926 0.74074074)  
##             124) symmetry_worst< -1.534985 31  14 B (0.45161290 0.54838710) *
##             125) symmetry_worst>=-1.534985 23   0 B (0.00000000 1.00000000) *
##            63) compactness_se>=-3.475452 43   0 B (0.00000000 1.00000000) *
## 
## $trees[[91]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 406 B (0.44517544 0.55482456)  
##     2) texture_mean< 2.787462 98  29 M (0.70408163 0.29591837)  
##       4) compactness_se>=-3.891799 89  20 M (0.77528090 0.22471910)  
##         8) texture_worst>=4.056844 46   1 M (0.97826087 0.02173913)  
##          16) texture_mean>=2.709047 45   0 M (1.00000000 0.00000000) *
##          17) texture_mean< 2.709047 1   0 B (0.00000000 1.00000000) *
##         9) texture_worst< 4.056844 43  19 M (0.55813953 0.44186047)  
##          18) smoothness_worst>=-1.498451 26   5 M (0.80769231 0.19230769)  
##            36) smoothness_mean< -2.060513 22   1 M (0.95454545 0.04545455)  
##              72) texture_worst>=3.768766 21   0 M (1.00000000 0.00000000) *
##              73) texture_worst< 3.768766 1   0 B (0.00000000 1.00000000) *
##            37) smoothness_mean>=-2.060513 4   0 B (0.00000000 1.00000000) *
##          19) smoothness_worst< -1.498451 17   3 B (0.17647059 0.82352941)  
##            38) texture_mean>=2.758692 3   0 M (1.00000000 0.00000000) *
##            39) texture_mean< 2.758692 14   0 B (0.00000000 1.00000000) *
##       5) compactness_se< -3.891799 9   0 B (0.00000000 1.00000000) *
##     3) texture_mean>=2.787462 814 337 B (0.41400491 0.58599509)  
##       6) symmetry_worst< -2.49184 20   1 M (0.95000000 0.05000000)  
##        12) texture_mean< 3.276838 19   0 M (1.00000000 0.00000000) *
##        13) texture_mean>=3.276838 1   0 B (0.00000000 1.00000000) *
##       7) symmetry_worst>=-2.49184 794 318 B (0.40050378 0.59949622)  
##        14) symmetry_worst>=-1.327359 27   5 M (0.81481481 0.18518519)  
##          28) smoothness_mean< -2.307926 18   0 M (1.00000000 0.00000000) *
##          29) smoothness_mean>=-2.307926 9   4 B (0.44444444 0.55555556)  
##            58) smoothness_mean>=-2.288752 5   1 M (0.80000000 0.20000000)  
##             116) compactness_se< -2.883911 4   0 M (1.00000000 0.00000000) *
##             117) compactness_se>=-2.883911 1   0 B (0.00000000 1.00000000) *
##            59) smoothness_mean< -2.288752 4   0 B (0.00000000 1.00000000) *
##        15) symmetry_worst< -1.327359 767 296 B (0.38591917 0.61408083)  
##          30) smoothness_mean< -2.093138 744 295 B (0.39650538 0.60349462)  
##            60) smoothness_worst>=-1.460895 141  63 M (0.55319149 0.44680851)  
##             120) compactness_se>=-4.032549 72  17 M (0.76388889 0.23611111) *
##             121) compactness_se< -4.032549 69  23 B (0.33333333 0.66666667) *
##            61) smoothness_worst< -1.460895 603 217 B (0.35986733 0.64013267)  
##             122) smoothness_worst< -1.4768 529 210 B (0.39697543 0.60302457) *
##             123) smoothness_worst>=-1.4768 74   7 B (0.09459459 0.90540541) *
##          31) smoothness_mean>=-2.093138 23   1 B (0.04347826 0.95652174)  
##            62) texture_mean< 2.94627 1   0 M (1.00000000 0.00000000) *
##            63) texture_mean>=2.94627 22   0 B (0.00000000 1.00000000) *
## 
## $trees[[92]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 417 B (0.45723684 0.54276316)  
##     2) smoothness_worst< -1.532606 360 158 M (0.56111111 0.43888889)  
##       4) smoothness_worst>=-1.559144 117  29 M (0.75213675 0.24786325)  
##         8) smoothness_mean>=-2.48706 111  23 M (0.79279279 0.20720721)  
##          16) texture_mean>=2.862952 90  10 M (0.88888889 0.11111111)  
##            32) symmetry_worst>=-2.201537 80   5 M (0.93750000 0.06250000)  
##              64) compactness_se>=-4.694501 78   3 M (0.96153846 0.03846154) *
##              65) compactness_se< -4.694501 2   0 B (0.00000000 1.00000000) *
##            33) symmetry_worst< -2.201537 10   5 M (0.50000000 0.50000000)  
##              66) smoothness_mean< -2.414006 5   0 M (1.00000000 0.00000000) *
##              67) smoothness_mean>=-2.414006 5   0 B (0.00000000 1.00000000) *
##          17) texture_mean< 2.862952 21   8 B (0.38095238 0.61904762)  
##            34) smoothness_worst>=-1.547262 10   2 M (0.80000000 0.20000000)  
##              68) compactness_se>=-4.272056 8   0 M (1.00000000 0.00000000) *
##              69) compactness_se< -4.272056 2   0 B (0.00000000 1.00000000) *
##            35) smoothness_worst< -1.547262 11   0 B (0.00000000 1.00000000) *
##         9) smoothness_mean< -2.48706 6   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst< -1.559144 243 114 B (0.46913580 0.53086420)  
##        10) compactness_se< -4.579712 39   7 M (0.82051282 0.17948718)  
##          20) compactness_se>=-4.740419 32   0 M (1.00000000 0.00000000) *
##          21) compactness_se< -4.740419 7   0 B (0.00000000 1.00000000) *
##        11) compactness_se>=-4.579712 204  82 B (0.40196078 0.59803922)  
##          22) symmetry_worst>=-1.966444 138  66 M (0.52173913 0.47826087)  
##            44) compactness_se>=-4.260936 96  34 M (0.64583333 0.35416667)  
##              88) smoothness_worst< -1.568787 86  25 M (0.70930233 0.29069767) *
##              89) smoothness_worst>=-1.568787 10   1 B (0.10000000 0.90000000) *
##            45) compactness_se< -4.260936 42  10 B (0.23809524 0.76190476)  
##              90) smoothness_worst< -1.61379 12   2 M (0.83333333 0.16666667) *
##              91) smoothness_worst>=-1.61379 30   0 B (0.00000000 1.00000000) *
##          23) symmetry_worst< -1.966444 66  10 B (0.15151515 0.84848485)  
##            46) smoothness_worst< -1.694089 10   4 M (0.60000000 0.40000000)  
##              92) texture_mean>=3.03091 7   1 M (0.85714286 0.14285714) *
##              93) texture_mean< 3.03091 3   0 B (0.00000000 1.00000000) *
##            47) smoothness_worst>=-1.694089 56   4 B (0.07142857 0.92857143)  
##              94) compactness_se>=-2.674921 2   0 M (1.00000000 0.00000000) *
##              95) compactness_se< -2.674921 54   2 B (0.03703704 0.96296296) *
##     3) smoothness_worst>=-1.532606 552 215 B (0.38949275 0.61050725)  
##       6) smoothness_mean< -2.235394 433 190 B (0.43879908 0.56120092)  
##        12) texture_worst< 4.545516 132  47 M (0.64393939 0.35606061)  
##          24) smoothness_worst>=-1.520499 119  34 M (0.71428571 0.28571429)  
##            48) texture_mean>=2.892314 44   1 M (0.97727273 0.02272727)  
##              96) compactness_se>=-3.950529 39   0 M (1.00000000 0.00000000) *
##              97) compactness_se< -3.950529 5   1 M (0.80000000 0.20000000) *
##            49) texture_mean< 2.892314 75  33 M (0.56000000 0.44000000)  
##              98) smoothness_worst>=-1.496838 63  21 M (0.66666667 0.33333333) *
##              99) smoothness_worst< -1.496838 12   0 B (0.00000000 1.00000000) *
##          25) smoothness_worst< -1.520499 13   0 B (0.00000000 1.00000000) *
##        13) texture_worst>=4.545516 301 105 B (0.34883721 0.65116279)  
##          26) texture_mean>=2.975018 245  99 B (0.40408163 0.59591837)  
##            52) compactness_se< -3.500605 155  74 M (0.52258065 0.47741935)  
##             104) texture_mean< 3.210432 102  30 M (0.70588235 0.29411765) *
##             105) texture_mean>=3.210432 53   9 B (0.16981132 0.83018868) *
##            53) compactness_se>=-3.500605 90  18 B (0.20000000 0.80000000)  
##             106) smoothness_mean>=-2.281841 8   0 M (1.00000000 0.00000000) *
##             107) smoothness_mean< -2.281841 82  10 B (0.12195122 0.87804878) *
##          27) texture_mean< 2.975018 56   6 B (0.10714286 0.89285714)  
##            54) symmetry_worst>=-1.420115 3   0 M (1.00000000 0.00000000) *
##            55) symmetry_worst< -1.420115 53   3 B (0.05660377 0.94339623)  
##             110) texture_mean< 2.851854 2   0 M (1.00000000 0.00000000) *
##             111) texture_mean>=2.851854 51   1 B (0.01960784 0.98039216) *
##       7) smoothness_mean>=-2.235394 119  25 B (0.21008403 0.78991597)  
##        14) smoothness_mean>=-2.07745 8   1 M (0.87500000 0.12500000)  
##          28) symmetry_worst< -1.400188 7   0 M (1.00000000 0.00000000) *
##          29) symmetry_worst>=-1.400188 1   0 B (0.00000000 1.00000000) *
##        15) smoothness_mean< -2.07745 111  18 B (0.16216216 0.83783784)  
##          30) texture_mean>=3.044522 20  10 M (0.50000000 0.50000000)  
##            60) texture_mean< 3.184212 10   0 M (1.00000000 0.00000000) *
##            61) texture_mean>=3.184212 10   0 B (0.00000000 1.00000000) *
##          31) texture_mean< 3.044522 91   8 B (0.08791209 0.91208791)  
##            62) compactness_se< -3.492408 41   8 B (0.19512195 0.80487805)  
##             124) compactness_se>=-3.664511 3   0 M (1.00000000 0.00000000) *
##             125) compactness_se< -3.664511 38   5 B (0.13157895 0.86842105) *
##            63) compactness_se>=-3.492408 50   0 B (0.00000000 1.00000000) *
## 
## $trees[[93]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 440 B (0.48245614 0.51754386)  
##     2) smoothness_worst< -1.4768 618 282 M (0.54368932 0.45631068)  
##       4) smoothness_worst>=-1.604472 521 210 M (0.59692898 0.40307102)  
##         8) smoothness_worst>=-1.482699 51   6 M (0.88235294 0.11764706)  
##          16) compactness_se>=-3.894783 43   0 M (1.00000000 0.00000000) *
##          17) compactness_se< -3.894783 8   2 B (0.25000000 0.75000000)  
##            34) smoothness_mean< -2.282367 2   0 M (1.00000000 0.00000000) *
##            35) smoothness_mean>=-2.282367 6   0 B (0.00000000 1.00000000) *
##         9) smoothness_worst< -1.482699 470 204 M (0.56595745 0.43404255)  
##          18) smoothness_worst< -1.484675 455 189 M (0.58461538 0.41538462)  
##            36) texture_mean>=3.034949 171  51 M (0.70175439 0.29824561)  
##              72) texture_worst< 4.797934 43   0 M (1.00000000 0.00000000) *
##              73) texture_worst>=4.797934 128  51 M (0.60156250 0.39843750) *
##            37) texture_mean< 3.034949 284 138 M (0.51408451 0.48591549)  
##              74) symmetry_worst< -2.111279 36   5 M (0.86111111 0.13888889) *
##              75) symmetry_worst>=-2.111279 248 115 B (0.46370968 0.53629032) *
##          19) smoothness_worst>=-1.484675 15   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst< -1.604472 97  25 B (0.25773196 0.74226804)  
##        10) symmetry_worst>=-1.550826 12   3 M (0.75000000 0.25000000)  
##          20) texture_mean>=2.967432 10   1 M (0.90000000 0.10000000)  
##            40) smoothness_mean>=-2.592204 9   0 M (1.00000000 0.00000000) *
##            41) smoothness_mean< -2.592204 1   0 B (0.00000000 1.00000000) *
##          21) texture_mean< 2.967432 2   0 B (0.00000000 1.00000000) *
##        11) symmetry_worst< -1.550826 85  16 B (0.18823529 0.81176471)  
##          22) texture_mean< 3.086027 41  16 B (0.39024390 0.60975610)  
##            44) texture_mean>=2.935975 27  11 M (0.59259259 0.40740741)  
##              88) smoothness_mean< -2.484925 23   7 M (0.69565217 0.30434783) *
##              89) smoothness_mean>=-2.484925 4   0 B (0.00000000 1.00000000) *
##            45) texture_mean< 2.935975 14   0 B (0.00000000 1.00000000) *
##          23) texture_mean>=3.086027 44   0 B (0.00000000 1.00000000) *
##     3) smoothness_worst>=-1.4768 294 104 B (0.35374150 0.64625850)  
##       6) symmetry_worst>=-1.352813 27   5 M (0.81481481 0.18518519)  
##        12) smoothness_mean>=-2.365259 24   2 M (0.91666667 0.08333333)  
##          24) smoothness_mean< -2.003731 22   1 M (0.95454545 0.04545455)  
##            48) compactness_se< -2.646661 20   0 M (1.00000000 0.00000000) *
##            49) compactness_se>=-2.646661 2   1 M (0.50000000 0.50000000)  
##              98) texture_mean>=2.915767 1   0 M (1.00000000 0.00000000) *
##              99) texture_mean< 2.915767 1   0 B (0.00000000 1.00000000) *
##          25) smoothness_mean>=-2.003731 2   1 M (0.50000000 0.50000000)  
##            50) texture_mean>=2.823221 1   0 M (1.00000000 0.00000000) *
##            51) texture_mean< 2.823221 1   0 B (0.00000000 1.00000000) *
##        13) smoothness_mean< -2.365259 3   0 B (0.00000000 1.00000000) *
##       7) symmetry_worst< -1.352813 267  82 B (0.30711610 0.69288390)  
##        14) texture_worst< 4.624204 97  47 B (0.48453608 0.51546392)  
##          28) texture_mean>=2.934384 36   2 M (0.94444444 0.05555556)  
##            56) compactness_se>=-4.35833 34   0 M (1.00000000 0.00000000) *
##            57) compactness_se< -4.35833 2   0 B (0.00000000 1.00000000) *
##          29) texture_mean< 2.934384 61  13 B (0.21311475 0.78688525)  
##            58) texture_mean< 2.754252 13   4 M (0.69230769 0.30769231)  
##             116) texture_mean>=2.728421 7   0 M (1.00000000 0.00000000) *
##             117) texture_mean< 2.728421 6   2 B (0.33333333 0.66666667) *
##            59) texture_mean>=2.754252 48   4 B (0.08333333 0.91666667)  
##             118) smoothness_worst>=-1.449274 14   4 B (0.28571429 0.71428571) *
##             119) smoothness_worst< -1.449274 34   0 B (0.00000000 1.00000000) *
##        15) texture_worst>=4.624204 170  35 B (0.20588235 0.79411765)  
##          30) texture_worst>=4.824912 83  33 B (0.39759036 0.60240964)  
##            60) symmetry_worst< -1.820896 11   0 M (1.00000000 0.00000000) *
##            61) symmetry_worst>=-1.820896 72  22 B (0.30555556 0.69444444)  
##             122) symmetry_worst>=-1.655812 32  11 M (0.65625000 0.34375000) *
##             123) symmetry_worst< -1.655812 40   1 B (0.02500000 0.97500000) *
##          31) texture_worst< 4.824912 87   2 B (0.02298851 0.97701149)  
##            62) compactness_se< -4.222024 1   0 M (1.00000000 0.00000000) *
##            63) compactness_se>=-4.222024 86   1 B (0.01162791 0.98837209)  
##             126) compactness_se>=-3.370923 1   0 M (1.00000000 0.00000000) *
##             127) compactness_se< -3.370923 85   0 B (0.00000000 1.00000000) *
## 
## $trees[[94]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 433 B (0.47478070 0.52521930)  
##     2) texture_worst< 4.642157 458 195 M (0.57423581 0.42576419)  
##       4) compactness_se< -3.344528 362 133 M (0.63259669 0.36740331)  
##         8) symmetry_worst< -1.692017 264  75 M (0.71590909 0.28409091)  
##          16) symmetry_worst>=-1.815934 114  11 M (0.90350877 0.09649123)  
##            32) compactness_se>=-4.45131 112   9 M (0.91964286 0.08035714)  
##              64) smoothness_mean< -2.188699 111   8 M (0.92792793 0.07207207) *
##              65) smoothness_mean>=-2.188699 1   0 B (0.00000000 1.00000000) *
##            33) compactness_se< -4.45131 2   0 B (0.00000000 1.00000000) *
##          17) symmetry_worst< -1.815934 150  64 M (0.57333333 0.42666667)  
##            34) smoothness_mean>=-2.419122 121  40 M (0.66942149 0.33057851)  
##              68) symmetry_worst< -1.824299 109  29 M (0.73394495 0.26605505) *
##              69) symmetry_worst>=-1.824299 12   1 B (0.08333333 0.91666667) *
##            35) smoothness_mean< -2.419122 29   5 B (0.17241379 0.82758621)  
##              70) compactness_se>=-3.514597 10   5 M (0.50000000 0.50000000) *
##              71) compactness_se< -3.514597 19   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst>=-1.692017 98  40 B (0.40816327 0.59183673)  
##          18) smoothness_worst>=-1.451541 19   3 M (0.84210526 0.15789474)  
##            36) smoothness_worst< -1.42057 13   0 M (1.00000000 0.00000000) *
##            37) smoothness_worst>=-1.42057 6   3 M (0.50000000 0.50000000)  
##              74) texture_mean>=2.688296 3   0 M (1.00000000 0.00000000) *
##              75) texture_mean< 2.688296 3   0 B (0.00000000 1.00000000) *
##          19) smoothness_worst< -1.451541 79  24 B (0.30379747 0.69620253)  
##            38) compactness_se< -4.681232 14   3 M (0.78571429 0.21428571)  
##              76) texture_mean< 2.936149 11   0 M (1.00000000 0.00000000) *
##              77) texture_mean>=2.936149 3   0 B (0.00000000 1.00000000) *
##            39) compactness_se>=-4.681232 65  13 B (0.20000000 0.80000000)  
##              78) smoothness_mean>=-2.170258 6   0 M (1.00000000 0.00000000) *
##              79) smoothness_mean< -2.170258 59   7 B (0.11864407 0.88135593) *
##       5) compactness_se>=-3.344528 96  34 B (0.35416667 0.64583333)  
##        10) symmetry_worst>=-1.001713 16   0 M (1.00000000 0.00000000) *
##        11) symmetry_worst< -1.001713 80  18 B (0.22500000 0.77500000)  
##          22) texture_mean>=3.023554 9   1 M (0.88888889 0.11111111)  
##            44) texture_worst< 4.544 8   0 M (1.00000000 0.00000000) *
##            45) texture_worst>=4.544 1   0 B (0.00000000 1.00000000) *
##          23) texture_mean< 3.023554 71  10 B (0.14084507 0.85915493)  
##            46) smoothness_worst>=-1.500893 29  10 B (0.34482759 0.65517241)  
##              92) texture_mean>=2.915992 9   0 M (1.00000000 0.00000000) *
##              93) texture_mean< 2.915992 20   1 B (0.05000000 0.95000000) *
##            47) smoothness_worst< -1.500893 42   0 B (0.00000000 1.00000000) *
##     3) texture_worst>=4.642157 454 170 B (0.37444934 0.62555066)  
##       6) compactness_se>=-3.334337 80  29 M (0.63750000 0.36250000)  
##        12) smoothness_mean>=-2.338127 34   0 M (1.00000000 0.00000000) *
##        13) smoothness_mean< -2.338127 46  17 B (0.36956522 0.63043478)  
##          26) texture_worst>=4.993407 16   0 M (1.00000000 0.00000000) *
##          27) texture_worst< 4.993407 30   1 B (0.03333333 0.96666667)  
##            54) texture_worst< 4.684099 4   1 B (0.25000000 0.75000000)  
##             108) texture_mean>=3.079461 1   0 M (1.00000000 0.00000000) *
##             109) texture_mean< 3.079461 3   0 B (0.00000000 1.00000000) *
##            55) texture_worst>=4.684099 26   0 B (0.00000000 1.00000000) *
##       7) compactness_se< -3.334337 374 119 B (0.31818182 0.68181818)  
##        14) smoothness_mean>=-2.403622 255  98 B (0.38431373 0.61568627)  
##          28) smoothness_mean< -2.382712 33   3 M (0.90909091 0.09090909)  
##            56) symmetry_worst>=-2.212871 31   1 M (0.96774194 0.03225806)  
##             112) texture_mean>=2.920077 30   0 M (1.00000000 0.00000000) *
##             113) texture_mean< 2.920077 1   0 B (0.00000000 1.00000000) *
##            57) symmetry_worst< -2.212871 2   0 B (0.00000000 1.00000000) *
##          29) smoothness_mean>=-2.382712 222  68 B (0.30630631 0.69369369)  
##            58) texture_worst>=4.818867 126  53 B (0.42063492 0.57936508)  
##             116) symmetry_worst>=-1.71268 40  11 M (0.72500000 0.27500000) *
##             117) symmetry_worst< -1.71268 86  24 B (0.27906977 0.72093023) *
##            59) texture_worst< 4.818867 96  15 B (0.15625000 0.84375000)  
##             118) smoothness_mean< -2.353585 9   2 M (0.77777778 0.22222222) *
##             119) smoothness_mean>=-2.353585 87   8 B (0.09195402 0.90804598) *
##        15) smoothness_mean< -2.403622 119  21 B (0.17647059 0.82352941)  
##          30) smoothness_mean< -2.443746 66  21 B (0.31818182 0.68181818)  
##            60) smoothness_mean>=-2.450864 8   0 M (1.00000000 0.00000000) *
##            61) smoothness_mean< -2.450864 58  13 B (0.22413793 0.77586207)  
##             122) compactness_se< -4.620161 13   5 M (0.61538462 0.38461538) *
##             123) compactness_se>=-4.620161 45   5 B (0.11111111 0.88888889) *
##          31) smoothness_mean>=-2.443746 53   0 B (0.00000000 1.00000000) *
## 
## $trees[[95]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 448 B (0.49122807 0.50877193)  
##     2) smoothness_worst>=-1.609811 812 389 M (0.52093596 0.47906404)  
##       4) smoothness_worst< -1.533657 271  99 M (0.63468635 0.36531365)  
##         8) compactness_se>=-3.545992 67   9 M (0.86567164 0.13432836)  
##          16) texture_worst>=4.411908 50   2 M (0.96000000 0.04000000)  
##            32) symmetry_worst>=-1.956813 41   0 M (1.00000000 0.00000000) *
##            33) symmetry_worst< -1.956813 9   2 M (0.77777778 0.22222222)  
##              66) texture_mean>=3.044039 7   0 M (1.00000000 0.00000000) *
##              67) texture_mean< 3.044039 2   0 B (0.00000000 1.00000000) *
##          17) texture_worst< 4.411908 17   7 M (0.58823529 0.41176471)  
##            34) compactness_se< -3.464112 10   0 M (1.00000000 0.00000000) *
##            35) compactness_se>=-3.464112 7   0 B (0.00000000 1.00000000) *
##         9) compactness_se< -3.545992 204  90 M (0.55882353 0.44117647)  
##          18) smoothness_mean>=-2.501158 193  79 M (0.59067358 0.40932642)  
##            36) symmetry_worst>=-2.063958 159  55 M (0.65408805 0.34591195)  
##              72) texture_mean>=2.983598 74   8 M (0.89189189 0.10810811) *
##              73) texture_mean< 2.983598 85  38 B (0.44705882 0.55294118) *
##            37) symmetry_worst< -2.063958 34  10 B (0.29411765 0.70588235)  
##              74) smoothness_worst< -1.594361 10   0 M (1.00000000 0.00000000) *
##              75) smoothness_worst>=-1.594361 24   0 B (0.00000000 1.00000000) *
##          19) smoothness_mean< -2.501158 11   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst>=-1.533657 541 251 B (0.46395564 0.53604436)  
##        10) smoothness_worst>=-1.52382 490 244 M (0.50204082 0.49795918)  
##          20) compactness_se>=-4.547852 463 217 M (0.53131749 0.46868251)  
##            40) symmetry_worst>=-1.529476 85  20 M (0.76470588 0.23529412)  
##              80) smoothness_mean>=-2.343303 75  10 M (0.86666667 0.13333333) *
##              81) smoothness_mean< -2.343303 10   0 B (0.00000000 1.00000000) *
##            41) symmetry_worst< -1.529476 378 181 B (0.47883598 0.52116402)  
##              82) smoothness_mean< -2.299091 187  73 M (0.60962567 0.39037433) *
##              83) smoothness_mean>=-2.299091 191  67 B (0.35078534 0.64921466) *
##          21) compactness_se< -4.547852 27   0 B (0.00000000 1.00000000) *
##        11) smoothness_worst< -1.52382 51   5 B (0.09803922 0.90196078)  
##          22) texture_mean>=3.084198 13   5 B (0.38461538 0.61538462)  
##            44) texture_worst< 4.870528 5   0 M (1.00000000 0.00000000) *
##            45) texture_worst>=4.870528 8   0 B (0.00000000 1.00000000) *
##          23) texture_mean< 3.084198 38   0 B (0.00000000 1.00000000) *
##     3) smoothness_worst< -1.609811 100  25 B (0.25000000 0.75000000)  
##       6) symmetry_worst>=-1.550826 8   0 M (1.00000000 0.00000000) *
##       7) symmetry_worst< -1.550826 92  17 B (0.18478261 0.81521739)  
##        14) texture_mean< 3.086027 58  17 B (0.29310345 0.70689655)  
##          28) texture_worst>=4.818554 6   0 M (1.00000000 0.00000000) *
##          29) texture_worst< 4.818554 52  11 B (0.21153846 0.78846154)  
##            58) symmetry_worst>=-1.627715 5   0 M (1.00000000 0.00000000) *
##            59) symmetry_worst< -1.627715 47   6 B (0.12765957 0.87234043)  
##             118) texture_mean>=3.078218 4   0 M (1.00000000 0.00000000) *
##             119) texture_mean< 3.078218 43   2 B (0.04651163 0.95348837) *
##        15) texture_mean>=3.086027 34   0 B (0.00000000 1.00000000) *
## 
## $trees[[96]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 446 M (0.51096491 0.48903509)  
##     2) texture_worst< 4.782287 617 268 M (0.56564019 0.43435981)  
##       4) symmetry_worst>=-1.834844 434 145 M (0.66589862 0.33410138)  
##         8) symmetry_worst< -1.69453 216  48 M (0.77777778 0.22222222)  
##          16) smoothness_worst>=-1.587787 194  33 M (0.82989691 0.17010309)  
##            32) texture_mean>=2.891759 106   5 M (0.95283019 0.04716981)  
##              64) smoothness_mean>=-2.450833 103   2 M (0.98058252 0.01941748) *
##              65) smoothness_mean< -2.450833 3   0 B (0.00000000 1.00000000) *
##            33) texture_mean< 2.891759 88  28 M (0.68181818 0.31818182)  
##              66) symmetry_worst< -1.801798 38   0 M (1.00000000 0.00000000) *
##              67) symmetry_worst>=-1.801798 50  22 B (0.44000000 0.56000000) *
##          17) smoothness_worst< -1.587787 22   7 B (0.31818182 0.68181818)  
##            34) symmetry_worst>=-1.787851 9   2 M (0.77777778 0.22222222)  
##              68) texture_mean>=2.935975 7   0 M (1.00000000 0.00000000) *
##              69) texture_mean< 2.935975 2   0 B (0.00000000 1.00000000) *
##            35) symmetry_worst< -1.787851 13   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst>=-1.69453 218  97 M (0.55504587 0.44495413)  
##          18) symmetry_worst>=-1.656986 203  82 M (0.59605911 0.40394089)  
##            36) texture_mean>=3.061712 40   6 M (0.85000000 0.15000000)  
##              72) smoothness_worst>=-1.609426 34   0 M (1.00000000 0.00000000) *
##              73) smoothness_worst< -1.609426 6   0 B (0.00000000 1.00000000) *
##            37) texture_mean< 3.061712 163  76 M (0.53374233 0.46625767)  
##              74) texture_mean< 3.04903 149  62 M (0.58389262 0.41610738) *
##              75) texture_mean>=3.04903 14   0 B (0.00000000 1.00000000) *
##          19) symmetry_worst< -1.656986 15   0 B (0.00000000 1.00000000) *
##       5) symmetry_worst< -1.834844 183  60 B (0.32786885 0.67213115)  
##        10) smoothness_worst< -1.542557 65  29 M (0.55384615 0.44615385)  
##          20) smoothness_worst>=-1.559148 19   0 M (1.00000000 0.00000000) *
##          21) smoothness_worst< -1.559148 46  17 B (0.36956522 0.63043478)  
##            42) compactness_se>=-3.604909 14   4 M (0.71428571 0.28571429)  
##              84) symmetry_worst< -1.953067 12   2 M (0.83333333 0.16666667) *
##              85) symmetry_worst>=-1.953067 2   0 B (0.00000000 1.00000000) *
##            43) compactness_se< -3.604909 32   7 B (0.21875000 0.78125000)  
##              86) compactness_se< -4.563271 5   1 M (0.80000000 0.20000000) *
##              87) compactness_se>=-4.563271 27   3 B (0.11111111 0.88888889) *
##        11) smoothness_worst>=-1.542557 118  24 B (0.20338983 0.79661017)  
##          22) smoothness_worst>=-1.497846 54  22 B (0.40740741 0.59259259)  
##            44) smoothness_worst< -1.476215 20   4 M (0.80000000 0.20000000)  
##              88) compactness_se>=-3.79429 14   0 M (1.00000000 0.00000000) *
##              89) compactness_se< -3.79429 6   2 B (0.33333333 0.66666667) *
##            45) smoothness_worst>=-1.476215 34   6 B (0.17647059 0.82352941)  
##              90) smoothness_worst>=-1.424105 7   2 M (0.71428571 0.28571429) *
##              91) smoothness_worst< -1.424105 27   1 B (0.03703704 0.96296296) *
##          23) smoothness_worst< -1.497846 64   2 B (0.03125000 0.96875000)  
##            46) texture_worst>=4.649493 2   0 M (1.00000000 0.00000000) *
##            47) texture_worst< 4.649493 62   0 B (0.00000000 1.00000000) *
##     3) texture_worst>=4.782287 295 117 B (0.39661017 0.60338983)  
##       6) texture_worst>=4.911888 199  98 M (0.50753769 0.49246231)  
##        12) symmetry_worst< -1.733593 104  27 M (0.74038462 0.25961538)  
##          24) symmetry_worst>=-2.207988 85  11 M (0.87058824 0.12941176)  
##            48) compactness_se>=-4.706178 81   7 M (0.91358025 0.08641975)  
##              96) smoothness_mean< -2.140427 79   5 M (0.93670886 0.06329114) *
##              97) smoothness_mean>=-2.140427 2   0 B (0.00000000 1.00000000) *
##            49) compactness_se< -4.706178 4   0 B (0.00000000 1.00000000) *
##          25) symmetry_worst< -2.207988 19   3 B (0.15789474 0.84210526)  
##            50) compactness_se>=-3.413706 3   0 M (1.00000000 0.00000000) *
##            51) compactness_se< -3.413706 16   0 B (0.00000000 1.00000000) *
##        13) symmetry_worst>=-1.733593 95  24 B (0.25263158 0.74736842)  
##          26) smoothness_worst>=-1.426681 11   0 M (1.00000000 0.00000000) *
##          27) smoothness_worst< -1.426681 84  13 B (0.15476190 0.84523810)  
##            54) texture_worst< 4.941163 3   0 M (1.00000000 0.00000000) *
##            55) texture_worst>=4.941163 81  10 B (0.12345679 0.87654321)  
##             110) texture_worst>=5.003123 38  10 B (0.26315789 0.73684211) *
##             111) texture_worst< 5.003123 43   0 B (0.00000000 1.00000000) *
##       7) texture_worst< 4.911888 96  16 B (0.16666667 0.83333333)  
##        14) compactness_se>=-3.601962 25  12 B (0.48000000 0.52000000)  
##          28) smoothness_worst< -1.506135 11   1 M (0.90909091 0.09090909)  
##            56) smoothness_mean>=-2.51419 10   0 M (1.00000000 0.00000000) *
##            57) smoothness_mean< -2.51419 1   0 B (0.00000000 1.00000000) *
##          29) smoothness_worst>=-1.506135 14   2 B (0.14285714 0.85714286)  
##            58) smoothness_mean>=-2.272702 1   0 M (1.00000000 0.00000000) *
##            59) smoothness_mean< -2.272702 13   1 B (0.07692308 0.92307692)  
##             118) compactness_se< -3.500605 1   0 M (1.00000000 0.00000000) *
##             119) compactness_se>=-3.500605 12   0 B (0.00000000 1.00000000) *
##        15) compactness_se< -3.601962 71   4 B (0.05633803 0.94366197)  
##          30) symmetry_worst>=-1.35602 3   0 M (1.00000000 0.00000000) *
##          31) symmetry_worst< -1.35602 68   1 B (0.01470588 0.98529412)  
##            62) smoothness_mean< -2.480592 4   1 B (0.25000000 0.75000000)  
##             124) texture_mean< 3.136036 1   0 M (1.00000000 0.00000000) *
##             125) texture_mean>=3.136036 3   0 B (0.00000000 1.00000000) *
##            63) smoothness_mean>=-2.480592 64   0 B (0.00000000 1.00000000) *
## 
## $trees[[97]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 410 B (0.44956140 0.55043860)  
##     2) symmetry_worst>=-2.031981 769 371 B (0.48244473 0.51755527)  
##       4) smoothness_worst< -1.4768 498 227 M (0.54417671 0.45582329)  
##         8) smoothness_worst>=-1.482699 42   0 M (1.00000000 0.00000000) *
##         9) smoothness_worst< -1.482699 456 227 M (0.50219298 0.49780702)  
##          18) smoothness_worst>=-1.584838 362 161 M (0.55524862 0.44475138)  
##            36) symmetry_worst< -1.730674 175  56 M (0.68000000 0.32000000)  
##              72) compactness_se< -3.93685 92  15 M (0.83695652 0.16304348) *
##              73) compactness_se>=-3.93685 83  41 M (0.50602410 0.49397590) *
##            37) symmetry_worst>=-1.730674 187  82 B (0.43850267 0.56149733)  
##              74) compactness_se>=-3.681558 83  28 M (0.66265060 0.33734940) *
##              75) compactness_se< -3.681558 104  27 B (0.25961538 0.74038462) *
##          19) smoothness_worst< -1.584838 94  28 B (0.29787234 0.70212766)  
##            38) smoothness_mean< -2.473852 52  25 B (0.48076923 0.51923077)  
##              76) smoothness_mean>=-2.507153 26   5 M (0.80769231 0.19230769) *
##              77) smoothness_mean< -2.507153 26   4 B (0.15384615 0.84615385) *
##            39) smoothness_mean>=-2.473852 42   3 B (0.07142857 0.92857143)  
##              78) symmetry_worst>=-1.550826 4   1 M (0.75000000 0.25000000) *
##              79) symmetry_worst< -1.550826 38   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst>=-1.4768 271 100 B (0.36900369 0.63099631)  
##        10) smoothness_worst>=-1.472112 220  99 B (0.45000000 0.55000000)  
##          20) symmetry_worst>=-1.75757 181  88 M (0.51381215 0.48618785)  
##            40) compactness_se>=-4.032549 108  36 M (0.66666667 0.33333333)  
##              80) texture_worst>=4.40818 62   7 M (0.88709677 0.11290323) *
##              81) texture_worst< 4.40818 46  17 B (0.36956522 0.63043478) *
##            41) compactness_se< -4.032549 73  21 B (0.28767123 0.71232877)  
##              82) symmetry_worst< -1.743442 9   0 M (1.00000000 0.00000000) *
##              83) symmetry_worst>=-1.743442 64  12 B (0.18750000 0.81250000) *
##          21) symmetry_worst< -1.75757 39   6 B (0.15384615 0.84615385)  
##            42) texture_worst>=5.041355 3   0 M (1.00000000 0.00000000) *
##            43) texture_worst< 5.041355 36   3 B (0.08333333 0.91666667)  
##              86) compactness_se< -3.791636 5   2 B (0.40000000 0.60000000) *
##              87) compactness_se>=-3.791636 31   1 B (0.03225806 0.96774194) *
##        11) smoothness_worst< -1.472112 51   1 B (0.01960784 0.98039216)  
##          22) texture_mean>=3.069079 1   0 M (1.00000000 0.00000000) *
##          23) texture_mean< 3.069079 50   0 B (0.00000000 1.00000000) *
##     3) symmetry_worst< -2.031981 143  39 B (0.27272727 0.72727273)  
##       6) symmetry_worst< -2.813177 9   0 M (1.00000000 0.00000000) *
##       7) symmetry_worst>=-2.813177 134  30 B (0.22388060 0.77611940)  
##        14) smoothness_worst< -1.720903 6   2 M (0.66666667 0.33333333)  
##          28) texture_mean< 3.103494 4   0 M (1.00000000 0.00000000) *
##          29) texture_mean>=3.103494 2   0 B (0.00000000 1.00000000) *
##        15) smoothness_worst>=-1.720903 128  26 B (0.20312500 0.79687500)  
##          30) smoothness_worst>=-1.448989 2   0 M (1.00000000 0.00000000) *
##          31) smoothness_worst< -1.448989 126  24 B (0.19047619 0.80952381)  
##            62) symmetry_worst< -2.384404 10   5 M (0.50000000 0.50000000)  
##             124) symmetry_worst>=-2.522371 5   0 M (1.00000000 0.00000000) *
##             125) symmetry_worst< -2.522371 5   0 B (0.00000000 1.00000000) *
##            63) symmetry_worst>=-2.384404 116  19 B (0.16379310 0.83620690)  
##             126) smoothness_mean< -2.334592 72  18 B (0.25000000 0.75000000) *
##             127) smoothness_mean>=-2.334592 44   1 B (0.02272727 0.97727273) *
## 
## $trees[[98]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 912 433 B (0.47478070 0.52521930)  
##    2) compactness_se< -3.132769 817 408 M (0.50061200 0.49938800)  
##      4) symmetry_worst< -1.503393 719 338 M (0.52990264 0.47009736)  
##        8) compactness_se>=-3.439211 78  17 M (0.78205128 0.21794872)  
##         16) smoothness_mean< -2.231223 69   9 M (0.86956522 0.13043478)  
##           32) texture_worst>=4.40102 48   2 M (0.95833333 0.04166667)  
##             64) smoothness_worst>=-1.579519 44   0 M (1.00000000 0.00000000) *
##             65) smoothness_worst< -1.579519 4   2 M (0.50000000 0.50000000) *
##           33) texture_worst< 4.40102 21   7 M (0.66666667 0.33333333)  
##             66) compactness_se< -3.420409 14   0 M (1.00000000 0.00000000) *
##             67) compactness_se>=-3.420409 7   0 B (0.00000000 1.00000000) *
##         17) smoothness_mean>=-2.231223 9   1 B (0.11111111 0.88888889)  
##           34) texture_mean>=3.004098 2   1 M (0.50000000 0.50000000)  
##             68) texture_mean< 3.140897 1   0 M (1.00000000 0.00000000) *
##             69) texture_mean>=3.140897 1   0 B (0.00000000 1.00000000) *
##           35) texture_mean< 3.004098 7   0 B (0.00000000 1.00000000) *
##        9) compactness_se< -3.439211 641 320 B (0.49921997 0.50078003)  
##         18) compactness_se< -3.492659 563 257 M (0.54351687 0.45648313)  
##           36) smoothness_worst>=-1.424105 29   1 M (0.96551724 0.03448276)  
##             72) compactness_se>=-4.089202 28   0 M (1.00000000 0.00000000) *
##             73) compactness_se< -4.089202 1   0 B (0.00000000 1.00000000) *
##           37) smoothness_worst< -1.424105 534 256 M (0.52059925 0.47940075)  
##             74) texture_mean< 2.84432 95  24 M (0.74736842 0.25263158) *
##             75) texture_mean>=2.84432 439 207 B (0.47152620 0.52847380) *
##         19) compactness_se>=-3.492659 78  14 B (0.17948718 0.82051282)  
##           38) smoothness_worst< -1.542472 15   5 M (0.66666667 0.33333333)  
##             76) smoothness_worst>=-1.618016 10   0 M (1.00000000 0.00000000) *
##             77) smoothness_worst< -1.618016 5   0 B (0.00000000 1.00000000) *
##           39) smoothness_worst>=-1.542472 63   4 B (0.06349206 0.93650794)  
##             78) texture_mean>=3.061712 5   2 M (0.60000000 0.40000000) *
##             79) texture_mean< 3.061712 58   1 B (0.01724138 0.98275862) *
##      5) symmetry_worst>=-1.503393 98  28 B (0.28571429 0.71428571)  
##       10) symmetry_worst>=-1.322543 25   8 M (0.68000000 0.32000000)  
##         20) texture_mean>=2.763873 21   4 M (0.80952381 0.19047619)  
##           40) smoothness_worst>=-1.496291 16   0 M (1.00000000 0.00000000) *
##           41) smoothness_worst< -1.496291 5   1 B (0.20000000 0.80000000)  
##             82) texture_mean>=3.158816 1   0 M (1.00000000 0.00000000) *
##             83) texture_mean< 3.158816 4   0 B (0.00000000 1.00000000) *
##         21) texture_mean< 2.763873 4   0 B (0.00000000 1.00000000) *
##       11) symmetry_worst< -1.322543 73  11 B (0.15068493 0.84931507)  
##         22) smoothness_mean< -2.370743 20   9 B (0.45000000 0.55000000)  
##           44) smoothness_mean>=-2.425324 7   0 M (1.00000000 0.00000000) *
##           45) smoothness_mean< -2.425324 13   2 B (0.15384615 0.84615385)  
##             90) texture_mean>=2.97943 2   0 M (1.00000000 0.00000000) *
##             91) texture_mean< 2.97943 11   0 B (0.00000000 1.00000000) *
##         23) smoothness_mean>=-2.370743 53   2 B (0.03773585 0.96226415)  
##           46) compactness_se>=-3.807621 12   2 B (0.16666667 0.83333333)  
##             92) compactness_se< -3.597317 2   0 M (1.00000000 0.00000000) *
##             93) compactness_se>=-3.597317 10   0 B (0.00000000 1.00000000) *
##           47) compactness_se< -3.807621 41   0 B (0.00000000 1.00000000) *
##    3) compactness_se>=-3.132769 95  24 B (0.25263158 0.74736842)  
##      6) smoothness_mean>=-2.291354 39  17 M (0.56410256 0.43589744)  
##       12) smoothness_worst>=-1.502935 29   7 M (0.75862069 0.24137931)  
##         24) compactness_se< -2.552001 21   0 M (1.00000000 0.00000000) *
##         25) compactness_se>=-2.552001 8   1 B (0.12500000 0.87500000)  
##           50) texture_mean>=2.929061 1   0 M (1.00000000 0.00000000) *
##           51) texture_mean< 2.929061 7   0 B (0.00000000 1.00000000) *
##       13) smoothness_worst< -1.502935 10   0 B (0.00000000 1.00000000) *
##      7) smoothness_mean< -2.291354 56   2 B (0.03571429 0.96428571)  
##       14) smoothness_worst< -1.720903 2   1 M (0.50000000 0.50000000)  
##         28) texture_mean< 3.103494 1   0 M (1.00000000 0.00000000) *
##         29) texture_mean>=3.103494 1   0 B (0.00000000 1.00000000) *
##       15) smoothness_worst>=-1.720903 54   1 B (0.01851852 0.98148148)  
##         30) texture_mean>=3.083423 15   1 B (0.06666667 0.93333333)  
##           60) texture_mean< 3.109209 1   0 M (1.00000000 0.00000000) *
##           61) texture_mean>=3.109209 14   0 B (0.00000000 1.00000000) *
##         31) texture_mean< 3.083423 39   0 B (0.00000000 1.00000000) *
## 
## $trees[[99]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 430 M (0.52850877 0.47149123)  
##     2) texture_worst>=4.580648 508 204 M (0.59842520 0.40157480)  
##       4) texture_worst< 4.642157 98  12 M (0.87755102 0.12244898)  
##         8) smoothness_worst>=-1.594449 91   6 M (0.93406593 0.06593407)  
##          16) compactness_se>=-4.694501 88   3 M (0.96590909 0.03409091)  
##            32) symmetry_worst>=-2.477165 86   1 M (0.98837209 0.01162791)  
##              64) smoothness_mean< -2.227061 65   0 M (1.00000000 0.00000000) *
##              65) smoothness_mean>=-2.227061 21   1 M (0.95238095 0.04761905) *
##            33) symmetry_worst< -2.477165 2   0 B (0.00000000 1.00000000) *
##          17) compactness_se< -4.694501 3   0 B (0.00000000 1.00000000) *
##         9) smoothness_worst< -1.594449 7   1 B (0.14285714 0.85714286)  
##          18) smoothness_mean< -2.558761 1   0 M (1.00000000 0.00000000) *
##          19) smoothness_mean>=-2.558761 6   0 B (0.00000000 1.00000000) *
##       5) texture_worst>=4.642157 410 192 M (0.53170732 0.46829268)  
##        10) texture_worst>=4.681966 369 155 M (0.57994580 0.42005420)  
##          20) smoothness_worst< -1.452126 292 107 M (0.63356164 0.36643836)  
##            40) smoothness_worst>=-1.50249 100  16 M (0.84000000 0.16000000)  
##              80) smoothness_mean>=-2.339781 64   3 M (0.95312500 0.04687500) *
##              81) smoothness_mean< -2.339781 36  13 M (0.63888889 0.36111111) *
##            41) smoothness_worst< -1.50249 192  91 M (0.52604167 0.47395833)  
##              82) symmetry_worst< -2.121358 28   3 M (0.89285714 0.10714286) *
##              83) symmetry_worst>=-2.121358 164  76 B (0.46341463 0.53658537) *
##          21) smoothness_worst>=-1.452126 77  29 B (0.37662338 0.62337662)  
##            42) compactness_se>=-4.032549 33  12 M (0.63636364 0.36363636)  
##              84) compactness_se< -3.425387 24   4 M (0.83333333 0.16666667) *
##              85) compactness_se>=-3.425387 9   1 B (0.11111111 0.88888889) *
##            43) compactness_se< -4.032549 44   8 B (0.18181818 0.81818182)  
##              86) smoothness_worst>=-1.425207 5   1 M (0.80000000 0.20000000) *
##              87) smoothness_worst< -1.425207 39   4 B (0.10256410 0.89743590) *
##        11) texture_worst< 4.681966 41   4 B (0.09756098 0.90243902)  
##          22) texture_mean>=3.067341 4   0 M (1.00000000 0.00000000) *
##          23) texture_mean< 3.067341 37   0 B (0.00000000 1.00000000) *
##     3) texture_worst< 4.580648 404 178 B (0.44059406 0.55940594)  
##       6) texture_worst< 4.543638 341 169 M (0.50439883 0.49560117)  
##        12) smoothness_mean< -2.411844 86  23 M (0.73255814 0.26744186)  
##          24) compactness_se< -3.483667 75  12 M (0.84000000 0.16000000)  
##            48) symmetry_worst>=-2.079923 71   8 M (0.88732394 0.11267606)  
##              96) texture_mean>=2.745392 68   5 M (0.92647059 0.07352941) *
##              97) texture_mean< 2.745392 3   0 B (0.00000000 1.00000000) *
##            49) symmetry_worst< -2.079923 4   0 B (0.00000000 1.00000000) *
##          25) compactness_se>=-3.483667 11   0 B (0.00000000 1.00000000) *
##        13) smoothness_mean>=-2.411844 255 109 B (0.42745098 0.57254902)  
##          26) compactness_se>=-3.891799 149  62 M (0.58389262 0.41610738)  
##            52) smoothness_worst>=-1.503711 93  27 M (0.70967742 0.29032258)  
##             104) smoothness_mean< -2.271585 29   0 M (1.00000000 0.00000000) *
##             105) smoothness_mean>=-2.271585 64  27 M (0.57812500 0.42187500) *
##            53) smoothness_worst< -1.503711 56  21 B (0.37500000 0.62500000)  
##             106) smoothness_worst< -1.565495 19   4 M (0.78947368 0.21052632) *
##             107) smoothness_worst>=-1.565495 37   6 B (0.16216216 0.83783784) *
##          27) compactness_se< -3.891799 106  22 B (0.20754717 0.79245283)  
##            54) texture_worst>=4.531936 9   0 M (1.00000000 0.00000000) *
##            55) texture_worst< 4.531936 97  13 B (0.13402062 0.86597938)  
##             110) symmetry_worst< -2.49184 4   0 M (1.00000000 0.00000000) *
##             111) symmetry_worst>=-2.49184 93   9 B (0.09677419 0.90322581) *
##       7) texture_worst>=4.543638 63   6 B (0.09523810 0.90476190)  
##        14) texture_mean>=3.07959 3   0 M (1.00000000 0.00000000) *
##        15) texture_mean< 3.07959 60   3 B (0.05000000 0.95000000)  
##          30) smoothness_mean< -2.486703 3   1 M (0.66666667 0.33333333)  
##            60) texture_mean>=2.935975 2   0 M (1.00000000 0.00000000) *
##            61) texture_mean< 2.935975 1   0 B (0.00000000 1.00000000) *
##          31) smoothness_mean>=-2.486703 57   1 B (0.01754386 0.98245614)  
##            62) compactness_se>=-3.096414 1   0 M (1.00000000 0.00000000) *
##            63) compactness_se< -3.096414 56   0 B (0.00000000 1.00000000) *
## 
## $trees[[100]]
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 412 M (0.54824561 0.45175439)  
##     2) texture_worst>=4.824912 274  84 M (0.69343066 0.30656934)  
##       4) smoothness_worst>=-1.623453 260  71 M (0.72692308 0.27307692)  
##         8) symmetry_worst>=-2.207988 236  53 M (0.77542373 0.22457627)  
##          16) symmetry_worst< -1.733593 115  10 M (0.91304348 0.08695652)  
##            32) texture_worst>=4.897936 109   5 M (0.95412844 0.04587156)  
##              64) compactness_se>=-4.706178 106   2 M (0.98113208 0.01886792) *
##              65) compactness_se< -4.706178 3   0 B (0.00000000 1.00000000) *
##            33) texture_worst< 4.897936 6   1 B (0.16666667 0.83333333)  
##              66) texture_mean< 3.156152 1   0 M (1.00000000 0.00000000) *
##              67) texture_mean>=3.156152 5   0 B (0.00000000 1.00000000) *
##          17) symmetry_worst>=-1.733593 121  43 M (0.64462810 0.35537190)  
##            34) symmetry_worst>=-1.703871 98  25 M (0.74489796 0.25510204)  
##              68) compactness_se>=-4.418276 73  10 M (0.86301370 0.13698630) *
##              69) compactness_se< -4.418276 25  10 B (0.40000000 0.60000000) *
##            35) symmetry_worst< -1.703871 23   5 B (0.21739130 0.78260870)  
##              70) texture_mean< 2.982883 5   0 M (1.00000000 0.00000000) *
##              71) texture_mean>=2.982883 18   0 B (0.00000000 1.00000000) *
##         9) symmetry_worst< -2.207988 24   6 B (0.25000000 0.75000000)  
##          18) compactness_se>=-3.413706 6   0 M (1.00000000 0.00000000) *
##          19) compactness_se< -3.413706 18   0 B (0.00000000 1.00000000) *
##       5) smoothness_worst< -1.623453 14   1 B (0.07142857 0.92857143)  
##        10) smoothness_mean>=-2.396135 1   0 M (1.00000000 0.00000000) *
##        11) smoothness_mean< -2.396135 13   0 B (0.00000000 1.00000000) *
##     3) texture_worst< 4.824912 638 310 B (0.48589342 0.51410658)  
##       6) texture_worst< 4.747573 576 276 M (0.52083333 0.47916667)  
##        12) texture_mean>=3.055881 72   8 M (0.88888889 0.11111111)  
##          24) smoothness_worst>=-1.606352 52   0 M (1.00000000 0.00000000) *
##          25) smoothness_worst< -1.606352 20   8 M (0.60000000 0.40000000)  
##            50) smoothness_worst< -1.678162 13   1 M (0.92307692 0.07692308)  
##             100) texture_worst< 4.575818 12   0 M (1.00000000 0.00000000) *
##             101) texture_worst>=4.575818 1   0 B (0.00000000 1.00000000) *
##            51) smoothness_worst>=-1.678162 7   0 B (0.00000000 1.00000000) *
##        13) texture_mean< 3.055881 504 236 B (0.46825397 0.53174603)  
##          26) compactness_se< -3.492332 348 155 M (0.55459770 0.44540230)  
##            52) symmetry_worst< -1.503393 312 122 M (0.60897436 0.39102564)  
##             104) smoothness_worst>=-1.451352 56   7 M (0.87500000 0.12500000) *
##             105) smoothness_worst< -1.451352 256 115 M (0.55078125 0.44921875) *
##            53) symmetry_worst>=-1.503393 36   3 B (0.08333333 0.91666667)  
##             106) texture_mean>=2.998678 1   0 M (1.00000000 0.00000000) *
##             107) texture_mean< 2.998678 35   2 B (0.05714286 0.94285714) *
##          27) compactness_se>=-3.492332 156  43 B (0.27564103 0.72435897)  
##            54) symmetry_worst>=-1.327359 20   6 M (0.70000000 0.30000000)  
##             108) compactness_se< -2.646661 13   0 M (1.00000000 0.00000000) *
##             109) compactness_se>=-2.646661 7   1 B (0.14285714 0.85714286) *
##            55) symmetry_worst< -1.327359 136  29 B (0.21323529 0.78676471)  
##             110) texture_worst< 4.530419 84  25 B (0.29761905 0.70238095) *
##             111) texture_worst>=4.530419 52   4 B (0.07692308 0.92307692) *
##       7) texture_worst>=4.747573 62  10 B (0.16129032 0.83870968)  
##        14) symmetry_worst>=-1.758895 18   8 B (0.44444444 0.55555556)  
##          28) smoothness_mean< -2.321477 10   2 M (0.80000000 0.20000000)  
##            56) texture_mean< 3.091538 8   0 M (1.00000000 0.00000000) *
##            57) texture_mean>=3.091538 2   0 B (0.00000000 1.00000000) *
##          29) smoothness_mean>=-2.321477 8   0 B (0.00000000 1.00000000) *
##        15) symmetry_worst< -1.758895 44   2 B (0.04545455 0.95454545)  
##          30) compactness_se>=-2.72933 2   0 M (1.00000000 0.00000000) *
##          31) compactness_se< -2.72933 42   0 B (0.00000000 1.00000000) *
## 
## 
## $weights
##   [1] 1.0143223 0.9756740 0.6544576 0.7289082 0.4903051 0.5550935 0.2876624
##   [8] 0.4064599 0.5233357 0.5756598 0.6614213 0.7341538 0.8647840 0.6804425
##  [15] 0.2325935 0.5212432 0.5775010 0.5009532 0.7629577 0.5141035 0.4451419
##  [22] 0.4937205 0.2743158 0.4192772 0.8325064 0.5171780 0.5715872 0.6065961
##  [29] 0.5535083 0.5271030 0.3934626 0.3765380 0.4683799 0.7562199 0.6684508
##  [36] 0.5028706 0.5769558 0.5797812 0.4745713 0.4430292 0.5735213 0.3458962
##  [43] 0.3509300 0.5811027 0.6716382 0.8078402 0.5685447 0.5013177 0.8080208
##  [50] 0.6515440 0.6881326 0.6901804 0.5601635 0.5244748 1.0363324 0.7371558
##  [57] 0.3828816 0.6145244 0.6134399 0.4442746 0.8372090 0.6366471 0.7484774
##  [64] 0.6330865 0.4469461 0.5757497 0.5598130 0.8043061 0.6113138 0.5318741
##  [71] 0.6733939 0.4553865 0.6919234 0.5780642 0.3087808 0.3121142 0.4495589
##  [78] 0.5507634 0.3404249 0.5228403 0.4359747 0.5776337 0.5951755 0.5141975
##  [85] 0.4873377 0.6716764 0.6972312 0.5777235 0.4797364 0.4103516 0.3885086
##  [92] 0.7396938 0.5585543 0.8443694 0.5253073 0.7530113 0.5281069 0.4253419
##  [99] 0.6417699 0.5549526
## 
## $votes
##             [,1]      [,2]
##   [1,] 42.329044 15.241425
##   [2,] 39.490181 18.080288
##   [3,] 45.031439 12.539030
##   [4,] 42.252155 15.318315
##   [5,] 40.758192 16.812278
##   [6,] 41.813274 15.757196
##   [7,] 41.595879 15.974590
##   [8,] 44.876683 12.693787
##   [9,] 49.393533  8.176937
##  [10,] 39.165566 18.404903
##  [11,] 43.046311 14.524159
##  [12,] 39.041051 18.529418
##  [13,] 47.917722  9.652747
##  [14,] 40.380235 17.190235
##  [15,] 47.007937 10.562532
##    ...
##    (rows [16,] through [884,] omitted for brevity; one row of weighted
##    class votes per training observation, 912 rows in total)
##    ...
## [885,] 15.223388 42.347081
## [886,] 16.907704 40.662766
## [887,] 15.586970 41.983499
## [888,] 14.215714 43.354755
## [889,] 18.114428 39.456042
## [890,] 16.348660 41.221809
## [891,] 15.930001 41.640468
## [892,] 18.425627 39.144842
## [893,] 13.531725 44.038744
## [894,] 17.526021 40.044449
## [895,] 15.619177 41.951293
## [896,] 15.159378 42.411091
## [897,] 16.940786 40.629683
## [898,] 16.673553 40.896916
## [899,] 16.390306 41.180164
## [900,] 15.724532 41.845937
## [901,] 14.799221 42.771248
## [902,] 18.300022 39.270447
## [903,] 14.353778 43.216691
## [904,] 13.852387 43.718082
## [905,] 16.834420 40.736049
## [906,] 17.289087 40.281382
## [907,] 13.422934 44.147536
## [908,] 49.214369  8.356100
## [909,] 42.330540 15.239929
## [910,] 41.158148 16.412322
## [911,] 40.125715 17.444754
## [912,] 47.532105 10.038364
## 
## $prob
##             [,1]      [,2]
##   [1,] 0.7352562 0.2647438
##   [2,] 0.6859451 0.3140549
##   [3,] 0.7821968 0.2178032
##  ... (rows 4 to 909 omitted for brevity)
## [910,] 0.7149177 0.2850823
## [911,] 0.6969843 0.3030157
## [912,] 0.8256335 0.1743665
## 
## $class
##   [1] "M" "M" "M" "M" "M" "M" "M" "M" "M" "M" "M" "M" "M" "M" "M" "B" "B" "B"
##  [19] "M" "M" "M" "M" "M" "M" "M" "M" "M" "M" "M" "M" "B" "M" "M" "M" "M" "M"
##  ... (remaining entries omitted for brevity)
## [883] "B" "M" "B" "B" "B" "B" "B" "B" "B" "B" "B" "B" "B" "B" "B" "B" "B" "B"
## [901] "B" "B" "B" "B" "B" "B" "B" "M" "M" "M" "M" "M"
## 
## $importance
##   compactness_se  smoothness_mean smoothness_worst   symmetry_worst 
##         18.47356         17.49196         17.67883         16.51511 
##     texture_mean    texture_worst 
##         16.75694         13.08360 
## 
## $terms
## .outcome ~ texture_mean + smoothness_mean + compactness_se + 
##     texture_worst + smoothness_worst + symmetry_worst
## attr(,"variables")
## list(.outcome, texture_mean, smoothness_mean, compactness_se, 
##     texture_worst, smoothness_worst, symmetry_worst)
## attr(,"factors")
##                  texture_mean smoothness_mean compactness_se texture_worst
## .outcome                    0               0              0             0
## texture_mean                1               0              0             0
## smoothness_mean             0               1              0             0
## compactness_se              0               0              1             0
## texture_worst               0               0              0             1
## smoothness_worst            0               0              0             0
## symmetry_worst              0               0              0             0
##                  smoothness_worst symmetry_worst
## .outcome                        0              0
## texture_mean                    0              0
## smoothness_mean                 0              0
## compactness_se                  0              0
## texture_worst                   0              0
## smoothness_worst                1              0
## symmetry_worst                  0              1
## attr(,"term.labels")
## [1] "texture_mean"     "smoothness_mean"  "compactness_se"   "texture_worst"   
## [5] "smoothness_worst" "symmetry_worst"  
## attr(,"order")
## [1] 1 1 1 1 1 1
## attr(,"intercept")
## [1] 1
## attr(,"response")
## [1] 1
## attr(,".Environment")
## <environment: 0x000000003ec71dd8>
## attr(,"predvars")
## list(.outcome, texture_mean, smoothness_mean, compactness_se, 
##     texture_worst, smoothness_worst, symmetry_worst)
## attr(,"dataClasses")
##         .outcome     texture_mean  smoothness_mean   compactness_se 
##         "factor"        "numeric"        "numeric"        "numeric" 
##    texture_worst smoothness_worst   symmetry_worst 
##        "numeric"        "numeric"        "numeric" 
## 
## $call
## (function (formula, data, boos = TRUE, mfinal = 100, coeflearn = "Breiman", 
##     control, ...) 
## {
##     if (!(as.character(coeflearn) %in% c("Freund", "Breiman", 
##         "Zhu"))) {
##         stop("coeflearn must be 'Freund', 'Breiman' or 'Zhu' ")
##     }
##     formula <- as.formula(formula)
##     vardep <- data[, as.character(formula[[2]])]
##     n <- length(data[, 1])
##     nclases <- nlevels(vardep)
##     pesos <- rep(1/n, n)
##     guardarpesos <- array(0, c(n, mfinal))
##     w <- rep(1/n, n)
##     data <- cbind(pesos, data)
##     arboles <- list()
##     pond <- rep(0, mfinal)
##     pred <- data.frame(rep(0, n))
##     arboles[[1]] <- rpart(formula, data = data[, -1], control = rpart.control(minsplit = 1, 
##         cp = -1, maxdepth = 30))
##     nvar <- dim(varImp(arboles[[1]], surrogates = FALSE, competes = FALSE))[1]
##     imp <- array(0, c(mfinal, nvar))
##     for (m in 1:mfinal) {
##         if (boos == TRUE) {
##             k <- 1
##             while (k == 1) {
##                 boostrap <- sample(1:n, replace = TRUE, prob = pesos)
##                 fit <- rpart(formula, data = data[boostrap, -1], 
##                   control = control)
##                 k <- length(fit$frame$var)
##             }
##             flearn <- predict(fit, newdata = data[, -1], type = "class")
##             ind <- as.numeric(vardep != flearn)
##             err <- sum(pesos * ind)
##         }
##         if (boos == FALSE) {
##             w <<- pesos
##             fit <- rpart(formula = formula, data = data[, -1], 
##                 weights = w, control = control)
##             flearn <- predict(fit, data = data[, -1], type = "class")
##             ind <- as.numeric(vardep != flearn)
##             err <- sum(pesos * ind)
##         }
##         c <- log((1 - err)/err)
##         if (coeflearn == "Breiman") {
##             c <- (1/2) * c
##         }
##         if (coeflearn == "Zhu") {
##             c <- c + log(nclases - 1)
##         }
##         guardarpesos[, m] <- pesos
##         pesos <- pesos * exp(c * ind)
##         pesos <- pesos/sum(pesos)
##         maxerror <- 0.5
##         eac <- 0.001
##         if (coeflearn == "Zhu") {
##             maxerror <- 1 - 1/nclases
##         }
##         if (err >= maxerror) {
##             pesos <- rep(1/n, n)
##             maxerror <- maxerror - eac
##             c <- log((1 - maxerror)/maxerror)
##             if (coeflearn == "Breiman") {
##                 c <- (1/2) * c
##             }
##             if (coeflearn == "Zhu") {
##                 c <- c + log(nclases - 1)
##             }
##         }
##         if (err == 0) {
##             pesos <- rep(1/n, n)
##             c <- log((1 - eac)/eac)
##             if (coeflearn == "Breiman") {
##                 c <- (1/2) * c
##             }
##             if (coeflearn == "Zhu") {
##                 c <- c + log(nclases - 1)
##             }
##         }
##         arboles[[m]] <- fit
##         pond[m] <- c
##         if (m == 1) {
##             pred <- flearn
##         }
##         else {
##             pred <- data.frame(pred, flearn)
##         }
##         if (length(fit$frame$var) > 1) {
##             k <- varImp(fit, surrogates = FALSE, competes = FALSE)
##             imp[m, ] <- k[sort(row.names(k)), ]
##         }
##         else {
##             imp[m, ] <- rep(0, nvar)
##         }
##     }
##     classfinal <- array(0, c(n, nlevels(vardep)))
##     for (i in 1:nlevels(vardep)) {
##         classfinal[, i] <- matrix(as.numeric(pred == levels(vardep)[i]), 
##             nrow = n) %*% as.vector(pond)
##     }
##     predclass <- rep("O", n)
##     predclass[] <- apply(classfinal, 1, FUN = select, vardep.summary = summary(vardep))
##     imppond <- as.vector(as.vector(pond) %*% imp)
##     imppond <- imppond/sum(imppond) * 100
##     names(imppond) <- sort(row.names(k))
##     votosporc <- classfinal/apply(classfinal, 1, sum)
##     ans <- list(formula = formula, trees = arboles, weights = pond, 
##         votes = classfinal, prob = votosporc, class = predclass, 
##         importance = imppond)
##     attr(ans, "vardep.summary") <- summary(vardep, maxsum = 700)
##     mf <- model.frame(formula = formula, data = data[, -1])
##     terms <- attr(mf, "terms")
##     ans$terms <- terms
##     ans$call <- match.call()
##     class(ans) <- "boosting"
##     ans
## })(formula = .outcome ~ ., data = list(texture_mean = c(2.33988087773774, 
## 2.87751164216656, 3.05635689537043, 3.01455402779458, 2.66305283517147, 
## 2.75366071235426, 2.99473177322041, 3.08282698040492, 3.17971910966701, 
## 3.14587493198371, 2.88424189752063, 3.17596832385692, 3.11839228628988, 
## 3.0022112396517, 3.02916704964023, 2.66444656362008, 2.75429745226753, 
## 2.52091708731103, 2.65745841498615, 3.0624559055969, 2.79728133483015, 
## 3.06944731137627, 3.00815479355255, 3.2296179214001, 2.71137799119488, 
## 2.92852352386054, 3.17722014959937, 3.27601201623901, 2.88368276974537, 
## 3.07223024452672, 2.91343703082716, 3.22684399451738, 3.03591406318682, 
## 3.07176695982999, 3.21124679770371, 3.08236858021354, 2.82375700881418, 
## 2.68307421503203, 3.10458667846607, 3.07269331469012, 2.79361608943186, 
## 2.90361698464619, 3.09195113129453, 2.93119375241642, 2.92154737536461, 
## 3.07223024452672, 2.96062309644042, 2.46725171454928, 3.04356960296815, 
## 3.09783749649114, 2.62900699376176, 3.17555070012983, 3.04499851485691, 
## 2.94654202936322, 3.19948911106801, 2.75937682826755, 2.80457176809283, 
## 2.97807733831527, 2.39242579699384, 2.78192004966867, 3.17680304844629, 
## 2.89037175789616, 3.04309284491383, 3.21526932927409, 3.26918863874179, 
## 2.75047091698616, 2.91885122921803, 3.06619073720255, 3.2023398562281, 
## 2.7239235502585, 3.17888681665184, 3.12500460925813, 2.90690105984738, 
## 2.9871959425317, 3.13679771383259, 2.88144312715186, 2.55256529826182, 
## 2.98416563718253, 3.21807550469743, 2.59749101053515, 2.95958682691764, 
## 2.74470351875025, 2.91993056013771, 3.0568273729138, 2.83262493568384, 
## 3.03302805829769, 2.97807733831527, 3.00518743232475, 2.76190687389292, 
## 2.75747508442973, 2.8136106967627, 3.1315734964654, 2.99623214859564, 
## 2.38139627341834, 2.8402473707136, 2.38784493694487, 2.84549061022345, 
## 3.20639830335709, 2.93969088267037, 2.58701187272515, 2.96938829821439, 
## 3.06991167172824, 2.63404478779171, 3.08694315360738, 2.8136106967627, 
## 2.73371794785079, 2.59450815970308, 2.48240351956988, 2.89314568477889, 
## 2.85128436918812, 2.76757618041624, 2.70604819843154, 2.68444033546308, 
## 2.93225985059842, 3.03399098567108, 3.03013370027132, 2.73046379593911, 
## 2.57108434602905, 2.73046379593911, 2.88703285663065, 3.03206420280138, 
## 2.96836107675786, 2.54474665014402, 2.56186769092413, 3.00469201492546, 
## 2.89867056071086, 3.10099278421148, 3.09285898428471, 2.98365969231972, 
## 2.9338568698359, 3.20599319903719, 2.83026783382646, 2.51688969564105, 
## 2.97705900828837, 2.47569771070269, 2.68852753461335, 2.71800053195538, 
## 2.67069441455844, 2.89369954798884, 3.00121720378456, 3.10099278421148, 
## 2.56955412384829, 3.08511583468868, 3.27978275977172, 3.01111337559229, 
## 2.70270259477561, 2.71535677628465, 2.92208573338569, 2.84432781939476, 
## 2.85589532836619, 2.76631910922619, 3.06385810260159, 2.90251989183181, 
## 3.14458322028635, 2.79300390698237, 3.0837431508767, 3.11307076597122, 
## 2.96114082878437, 3.28353933819392, 3.16758253048065, 2.92316158071916, 
## 2.8142103969306, 2.84897089215859, 3.00864849882054, 2.55800220485855, 
## 2.94127608775793, 3.24102862950933, 3.17010566049877, 2.82908719614504, 
## 2.86105737022739, 3.0708397460408, 3.48031658611475, 3.00815479355255, 
## 2.83438912314523, 2.60046499042227, 2.73825604315928, 3.17680304844629, 
## 3.10593106585207, 2.94864066602014, 3.29879544804407, 3.52075661671979, 
## 3.32539566824587, 2.7669478423497, 3.05635689537043, 3.06619073720255, 
## 3.67071548348627, 2.74727091425549, 2.90087199253003, 3.1684242813721, 
## 3.15700042115011, 2.85819285953193, 2.64688376586472, 3.22763733053677, 
## 2.7033726115511, 3.15955035878339, 2.91506437048654, 2.98669152890184, 
## 2.83790818836042, 2.96165829322024, 3.35933317756346, 2.84897089215859, 
## 3.51333488159901, 3.29805662274264, 2.96424160646262, 3.09421922026864, 
## 2.94180393152844, 3.0837431508767, 2.78562833574758, 3.01504458458636, 
## 2.8225686545448, 3.04404613383254, 2.8541687092322, 2.65042108826557, 
## 2.99473177322041, 2.88144312715186, 2.71997877196748, 3.28091121578765, 
## 2.64048488160644, 2.90032208874933, 2.93225985059842, 2.91235066461494, 
## 3.03302805829769, 2.57413778351594, 2.99373027088332, 2.93863268151342, 
## 2.98214032003452, 2.94968833505258, 2.77383794164021, 2.85991255041146, 
## 2.62321826558551, 2.58550584834412, 2.51365606307399, 2.89811944468699, 
## 2.89977188240808, 2.9391619220656, 2.99021709286588, 3.17220341666977, 
## 2.92369907065416, 2.89922137317315, 3.19826487096408, 2.76127496233951, 
## 2.54238908520136, 2.62756295018952, 2.95021175825218, 2.75302356674494, 
## 2.59301339111385, 2.37211115564266, 2.92316158071916, 2.82435065679837, 
## 2.6447553507299, 2.93757335938046, 2.9391619220656, 2.83321334405622, 
## 2.78377591163035, 2.97858611471902, 2.58926666511224, 3.06851794327964, 
## 2.7219531062712, 2.85070650150373, 2.55567572067621, 3.03061667540749, 
## 3.08557297755378, 2.74148497718845, 2.96269241947579, 2.98870765861703, 
## 2.69327491552006, 2.94549105711724, 3.04452243772342, 2.65535241210176, 
## 3.06479180948549, 2.86391369893314, 3.18924101973851, 2.80578168959555, 
## 2.82375700881418, 2.70537997254633, 3.07639017657145, 2.68852753461335, 
## 2.9391619220656, 2.69056488676119, 2.70537997254633, 2.83732253680635, 
## 2.95595140354215, 2.85991255041146, 3.24804620216798, 2.6440448711263, 
## 2.92262380173335, 2.78562833574758, 2.74019465442878, 2.90799335924598, 
## 2.89425310460414, 3.07130346040107, 2.90635446240277, 3.08099211750481, 
## 3.28952066443753, 2.84781214347737, 3.08648663682246, 3.14802408389625, 
## 2.58097411853423, 2.71469474382088, 2.85359250639287, 2.77695417974942, 
## 2.77695417974942, 3.00667221359233, 3.33967652501391, 2.71800053195538, 
## 2.93545134266906, 2.56186769092413, 2.86105737022739, 2.61885462229774, 
## 3.14802408389625, 2.7408400239252, 3.14458322028635, 2.50307395374345, 
## 2.82375700881418, 2.99423114742772, 3.10368941505908, 2.87469394517693, 
## 2.84374591655611, 2.93863268151342, 2.85991255041146, 2.69665215614984, 
## 2.84839168565528, 2.38967979984498, 2.78315767358902, 2.7047112998367, 
## 2.92262380173335, 2.69867303928961, 3.06198806933106, 3.02819946369149, 
## 2.88591740754678, 2.86619290219901, 2.82316300820271, 3.07639017657145, 
## 3.09602999486936, 3.39484390768998, 3.05258508514677, 3.04832472367316, 
## 2.49897390699944, 2.94654202936322, 2.63762773680566, 2.77383794164021, 
## 2.95125778345216, 2.95073490762326, 3.05776766447344, 2.70671597808907, 
## 2.8106067894273, 2.87186828633161, 3.11484775444415, 2.8724340572095, 
## 2.97246364661464, 2.82967768922391, 2.97654945413722, 2.97246364661464, 
## 2.77133794033813, 2.97552956623647, 2.75110969056266, 2.84490938381941, 
## 2.75937682826755, 3.2144661163795, 3.3332753651767, 2.87130219517581, 
## 2.96217549002515, 3.02140002030257, 3.06991167172824, 3.2188758248682, 
## 3.3403852422654, 3.42491390827947, 3.37724616083964, 3.22882615572137, 
## 3.2240623515555, 3.30137704637994, 2.91017438519234, 2.90251989183181, 
## 3.0022112396517, 3.03206420280138, 2.89591193827178, 3.14974008603334, 
## 2.90032208874933, 2.91723004539903, 2.7033726115511, 3.40019688132857, 
## 2.74855214441154, 2.75556971707019, 3.02188723103084, 2.8106067894273, 
## 2.68033636253469, 2.97092715463502, 2.89203703721523, 2.95699144523756, 
## 2.64333388638252, 2.42303124606991, 2.797890905102, 2.82435065679837, 
## 2.93492013415723, 3.00568260440716, 3.1108450806545, 2.58248697812686, 
## 3.02237420450041, 3.00617753141553, 2.86334308550825, 3.05588619637374, 
## 2.79239134953596, 2.9871959425317, 2.55489902160804, 2.57566101305646, 
## 2.8402473707136, 3.17596832385692, 2.68716699018579, 2.68784749378469, 
## 3.02140002030257, 2.61447185414264, 2.94811641961233, 2.92369907065416, 
## 3.0243197304059, 3.00864849882054, 2.90251989183181, 2.81540871942271, 
## 2.63188884013665, 3.07269331469012, 2.9871959425317, 2.9274534328007, 
## 2.50715725872282, 2.57794151575519, 2.598235335095, 2.79300390698237, 
## 2.86903462050803, 3.03783344957263, 3.19622113430339, 3.23828621838802, 
## 3.23632273847192, 2.67000213346468, 3.21847574484686, 3.23553626576131, 
## 3.03013370027132, 2.79422789734326, 2.96217549002515, 3.1108450806545, 
## 3.38201456224538, 3.08831145484708, 3.36453339729056, 3.32790958589232, 
## 3.12148347885955, 3.17513290192028, 3.30137704637994, 3.37997374521053, 
## 3.42165339022954, 3.22246936037833, 3.10861443061066, 3.34109345759245, 
## 3.33505757915761, 3.37861088298936, 3.20030443928277, 2.33988087773774, 
## 2.87751164216656, 3.05635689537043, 3.01455402779458, 2.66305283517147, 
## 2.99473177322041, 3.03639425527288, 3.17971910966701, 3.14587493198371, 
## 3.21084365317094, 3.11839228628988, 3.31563949330051, 3.0022112396517, 
## 3.02916704964023, 3.09783749649114, 2.52091708731103, 2.65745841498615, 
## 3.13723183582769, 3.0624559055969, 2.79728133483015, 3.06944731137627, 
## 3.00815479355255, 3.2296179214001, 2.71137799119488, 3.22326617316949, 
## 3.17722014959937, 2.88368276974537, 3.07223024452672, 3.07823349506573, 
## 2.91343703082716, 3.22684399451738, 3.03591406318682, 3.07176695982999, 
## 3.21124679770371, 3.00963517872298, 3.08236858021354, 2.86789890204411, 
## 2.82375700881418, 2.68307421503203, 3.07269331469012, 2.79361608943186, 
## 2.90361698464619, 2.92852352386054, 3.09195113129453, 2.92154737536461, 
## 3.07223024452672, 2.46725171454928, 2.70001802940495, 3.04356960296815, 
## 2.62900699376176, 3.17136484219715, 3.17555070012983, 3.04499851485691, 
## 2.94654202936322, 2.85243910372751, 2.80275413657151, 3.05917644611053, 
## 2.68375750853317, 3.19948911106801, 2.75937682826755, 2.80457176809283, 
## 2.97807733831527, 2.78192004966867, 3.17680304844629, 3.04309284491383, 
## 2.7638002162067, 3.21526932927409, 3.26918863874179, 2.75047091698616, 
## 2.91885122921803, 3.06619073720255, 3.08190996979504, 2.7239235502585, 
## 3.17888681665184, 3.12500460925813, 2.90690105984738, 2.9871959425317, 
## 2.88144312715186, 2.99272776453369, 2.55256529826182, 2.98416563718253, 
## 3.21807550469743, 2.59749101053515, 3.02140002030257, 2.96527306606928, 
## 2.95958682691764, 2.74470351875025, 2.90853906185161, 2.97909463240097, 
## 2.83262493568384, 3.03302805829769, 2.97807733831527, 3.00518743232475, 
## 2.76190687389292, 2.75747508442973, 3.1315734964654, 2.38139627341834, 
## 2.8402473707136, 3.00568260440716, 2.38784493694487, 2.79667139275574, 
## 3.20639830335709, 2.93969088267037, 2.79667139275574, 3.2236643416, 
## 2.58701187272515, 3.06991167172824, 2.63404478779171, 3.11218108619724, 
## 2.73371794785079, 2.86619290219901, 2.48240351956988, 2.89314568477889, 
## 2.85128436918812, 2.76757618041624, 2.70604819843154, 2.80819714970715, 
## 2.93225985059842, 2.71997877196748, 2.88535921607262, 3.03399098567108, 
## 3.03013370027132, 2.73046379593911, 2.57108434602905, 2.73046379593911, 
## 2.88703285663065, 2.96836107675786, 2.54474665014402, 2.56186769092413, 
## 3.00469201492546, 2.76883167336207, 2.89867056071086, 3.10099278421148, 
## 2.98365969231972, 2.27315628230323, 2.9338568698359, 3.20599319903719, 
## 2.83026783382646, 2.51688969564105, 2.97705900828837, 2.47569771070269, 
## 2.68852753461335, 2.71800053195538, 2.67069441455844, 2.89369954798884, 
## 3.10099278421148, 2.56955412384829, 3.08511583468868, 3.27978275977172, 
## 2.70270259477561, 3.10950728781284, 2.71535677628465, 2.92208573338569, 
## 2.84432781939476, 2.85589532836619, 2.76631910922619, 3.14069804380418, 
## 3.06385810260159, 2.90251989183181, 3.14458322028635, 3.10413814739778, 
## 3.0837431508767, 3.11307076597122, 3.00667221359233, 2.97348666460667, 
## 2.96114082878437, 3.28353933819392, 3.16758253048065, 2.92316158071916, 
## 2.84897089215859, 3.00864849882054, 3.11529150861163, 2.55800220485855, 
## 2.94127608775793, 2.91614779421115, 3.24102862950933, 3.17010566049877, 
## 2.82908719614504, 2.86105737022739, 3.0708397460408, 3.48031658611475, 
## 2.57718192589717, 2.63188884013665, 3.00815479355255, 2.83438912314523, 
## 2.60046499042227, 2.74148497718845, 3.17680304844629, 3.10593106585207, 
## 2.94864066602014, 3.29879544804407, 3.52075661671979, 3.32539566824587, 
## 2.7669478423497, 3.05635689537043, 3.29472513715164, 3.06619073720255, 
## 3.32683296637329, 2.74727091425549, 2.71071331852169, 2.90087199253003, 
## 3.15700042115011, 2.98870765861703, 2.64688376586472, 2.7033726115511, 
## 2.91506437048654, 2.98669152890184, 2.96165829322024, 2.83615020372953, 
## 3.35933317756346, 2.84897089215859, 3.14415227867226, 3.51333488159901, 
## 3.29805662274264, 3.13809951484091, 3.09693415406296, 2.96424160646262, 
## 3.09421922026864, 3.43785069931019, 2.94180393152844, 3.0837431508767, 
## 2.78562833574758, 3.01504458458636, 2.8225686545448, 2.56802155649851, 
## 3.04404613383254, 2.75174805636793, 3.19785645764413, 2.8541687092322, 
## 2.65042108826557, 2.99473177322041, 2.88144312715186, 2.71997877196748, 
## 2.64048488160644, 2.90032208874933, 2.93225985059842, 2.91235066461494, 
## 3.03302805829769, 2.57413778351594, 2.99373027088332, 2.93863268151342, 
## 2.98214032003452, 2.94968833505258, 2.77383794164021, 2.85991255041146, 
## 2.62321826558551, 2.51365606307399, 2.89977188240808, 3.1393996233664, 
## 2.99021709286588, 3.17220341666977, 2.92369907065416, 2.89922137317315, 
## 3.19826487096408, 2.66722820658195, 2.54238908520136, 2.62756295018952, 
## 2.95021175825218, 2.75302356674494, 2.37211115564266, 2.92316158071916, 
## 2.82435065679837, 2.93757335938046, 2.9391619220656, 2.83321334405622, 
## 2.78377591163035, 2.97858611471902, 2.58926666511224, 3.06851794327964, 
## 2.7219531062712, 2.88647528761704, 3.03061667540749, 3.08557297755378, 
## 2.74148497718845, 2.96269241947579, 2.98870765861703, 2.69327491552006, 
## 2.94549105711724, 3.06479180948549, 2.86391369893314, 3.18924101973851, 
## 2.80578168959555, 2.82375700881418, 2.70537997254633, 2.73760900334375, 
## 2.68852753461335, 2.9391619220656, 2.77446196662146, 2.83732253680635, 
## 2.95595140354215, 2.6440448711263, 2.92262380173335, 2.78562833574758, 
## 2.90799335924598, 2.89425310460414, 3.07130346040107, 2.90635446240277, 
## 2.83026783382646, 3.08099211750481, 2.89148225218019, 2.84781214347737, 
## 3.08648663682246, 3.14802408389625, 2.71469474382088, 2.85359250639287, 
## 2.77695417974942, 2.77695417974942, 3.00667221359233, 3.33967652501391, 
## 2.71800053195538, 2.93545134266906, 2.7033726115511, 3.12324559385295, 
## 2.61885462229774, 3.14802408389625, 2.78253905309295, 2.7408400239252, 
## 3.14458322028635, 2.50307395374345, 2.82375700881418, 2.99423114742772, 
## 3.10368941505908, 2.87469394517693, 2.84374591655611, 2.93863268151342, 
## 2.69665215614984, 2.84839168565528, 3.04547436544881, 2.38967979984498, 
## 2.90635446240277, 2.92262380173335, 3.06198806933106, 3.02819946369149, 
## 2.88591740754678, 2.82316300820271, 3.07639017657145, 3.09602999486936, 
## 3.39484390768998, 3.07731226054641, 2.49897390699944, 3.06385810260159, 
## 2.94654202936322, 2.63762773680566, 2.77383794164021, 2.95073490762326, 
## 3.05776766447344, 2.70671597808907, 3.09013294897548, 2.8106067894273, 
## 2.87186828633161, 3.11484775444415, 2.8724340572095, 2.97246364661464, 
## 3.08967788639652, 2.82967768922391, 2.97654945413722, 2.97246364661464, 
## 2.77133794033813, 2.97552956623647, 2.75110969056266, 3.23553626576131, 
## 2.75937682826755, 2.90799335924598, 2.82435065679837, 3.3332753651767, 
## 2.87130219517581, 2.96217549002515, 3.02140002030257, 3.06991167172824, 
## 3.2188758248682, 3.3403852422654, 2.84199817361195, 3.37724616083964, 
## 3.22882615572137, 3.2240623515555, 3.33932197794407, 3.26842760369745, 
## 3.29546642702991, 2.91017438519234, 3.0022112396517, 3.03206420280138, 
## 2.89591193827178, 3.14974008603334, 2.90032208874933, 2.91723004539903, 
## 3.33719205168624, 2.7033726115511, 2.74855214441154, 2.68033636253469, 
## 2.97092715463502, 2.89203703721523, 2.95699144523756, 2.87016905057865, 
## 2.42303124606991, 2.797890905102, 2.82435065679837, 2.93492013415723, 
## 2.78315767358902, 3.00568260440716, 3.02334744058696, 2.55178617862755, 
## 3.02237420450041, 3.00617753141553, 2.85128436918812, 3.05588619637374, 
## 2.81780106506133, 2.9871959425317, 2.55489902160804, 2.57566101305646, 
## 2.99773027621666, 2.75366071235426, 3.17596832385692, 2.68716699018579, 
## 2.68784749378469, 3.02140002030257, 2.61447185414264, 2.94811641961233, 
## 2.92369907065416, 3.0243197304059, 3.00864849882054, 2.90251989183181, 
## 2.81540871942271, 3.07269331469012, 2.9871959425317, 2.9274534328007, 
## 2.57261223020711, 2.93119375241642, 2.598235335095, 2.86562358820697, 
## 2.99673177388707, 2.79300390698237, 3.02868337369368, 2.86903462050803, 
## 3.19622113430339, 3.23828621838802, 2.67000213346468, 3.21847574484686, 
## 3.23553626576131, 3.03013370027132, 3.14544454678232, 2.79422789734326, 
## 2.80819714970715, 2.96217549002515, 3.18676577094997, 3.06712226964066, 
## 3.1108450806545, 3.38201456224538, 3.08831145484708, 3.36453339729056, 
## 3.31817802594206, 2.97501923195645, 3.32790958589232, 3.12148347885955, 
## 3.30137704637994, 3.37997374521053, 3.42165339022954, 3.22246936037833, 
## 3.10861443061066, 3.34109345759245, 3.37861088298936), smoothness_mean = c(-2.13368655653223, 
## -2.46816753378372, -2.21091790446822, -1.94841327927343, -2.29958958401425, 
## -2.0572887370387, -2.357780728462, -2.06120877341878, -2.13199879241851, 
## -2.5003045919681, -2.33201390368486, -2.47681943960538, -2.17948289586006, 
## -2.31597433011306, -2.14558134418438, -2.32493295665795, -2.23026443141442, 
## -2.27886856637673, -2.23212662934548, -2.18836394890402, -2.13199879241851, 
## -2.24999264287488, -2.36021420583068, -2.22377391256976, -2.31800334572243, 
## -2.19912638462582, -2.12276666641821, -2.36435411939168, -2.26336437984076, 
## -2.34236596300589, -2.40983628374102, -2.36584443263324, -2.28671174383776, 
## -2.50568094900448, -2.3989858672804, -2.33160204206454, -2.45340798272863, 
## -2.27205588795922, -2.43588794030847, -2.44911488568994, -2.56589980899753, 
## -2.49362454040772, -2.40174266452726, -2.35135525736347, -2.25094185984221, 
## -2.17419187822565, -2.51825662946955, -2.32769779380912, -2.08505728046547, 
## -2.25474776357989, -2.56122629666141, -2.14387340183922, -2.2595256035336, 
## -2.50850286364319, -2.23399230152843, -2.29560947925762, -2.38901482099243, 
## -2.38945102601571, -2.04716798112954, -2.23961029383266, -2.05104846717862, 
## -2.30920696930293, -2.20545838226332, -2.24148999363423, -2.10784101620153, 
## -2.33067597316057, -2.31526514615142, -2.35979056676483, -2.40472856666275, 
## -2.17859911321305, -2.4108386784343, -2.3859667019331, -2.28082360121253, 
## -2.26432638087696, -2.39931628195382, -2.25856820757727, -2.40983628374102, 
## -2.32769779380912, -2.35514234373272, -2.14558134418438, -2.30368569843808, 
## -1.96754244918243, -2.46781357236182, -2.43508844280714, -2.26625316374666, 
## -2.30930763875487, -2.54631407791736, -2.18747228589354, -2.16282315061889, 
## -2.35788640877914, -2.15244243456433, -2.15848474902029, -2.4767004132409, 
## -2.36733697022374, -2.24999264287488, -2.2063662352535, -2.43212446434903, 
## -2.37968214337901, -2.49896500703904, -2.23867176725039, -2.21457421567133, 
## -2.29461692334487, -2.35788640877914, -2.36127408934273, -2.25284300109923, 
## -2.33935281718626, -2.1507227436848, -2.38054663446376, -2.33088169214916, 
## -2.21457421567133, -2.44449433917674, -2.55194429112667, -2.16108553072035, 
## -2.50862573640859, -2.17595244206068, -2.3639287232351, -2.23305903034544, 
## -2.32749272870023, -2.36616407463692, -2.4471485441854, -2.19373068808196, 
## -2.59762751985212, -2.3739736890817, -2.58826916278315, -2.21732524904322, 
## -2.18925640768704, -2.29065652212877, -2.47230636781226, -2.47444159994024, 
## -2.42305924646192, -2.25474776357989, -2.31719124538341, -2.27496992596107, 
## -2.40262644717427, -2.07385716338594, -2.2966030213165, -2.43132796888677, 
## -2.39272864284701, -2.33314739857669, -2.3196295276033, -2.77242873503842, 
## -2.43737441934268, -2.21274438899426, -2.17068002211411, -2.3437196363526, 
## -2.40141144730094, -2.37871048338354, -2.45480430597156, -2.32544438714013, 
## -2.29560947925762, -2.51577831345509, -2.43623077786396, -2.66642852641139, 
## -2.2595256035336, -2.53313097407502, -2.60761680378094, -2.46240179444793, 
## -2.41150750021823, -2.17068002211411, -2.02268320786123, -2.30609123232333, 
## -2.42181918091774, -2.21732524904322, -2.43360535543245, -2.50323356648088, 
## -2.42238265644964, -2.2966030213165, -2.357780728462, -2.27691734624547, 
## -2.51900132355883, -2.36627064468018, -2.47456035773386, -2.27789248040367, 
## -2.47159563572833, -2.31202955182205, -2.25094185984221, -2.53792775176525, 
## -2.2182439445603, -2.17068002211411, -2.67611558257186, -2.55361384779779, 
## -2.39043318356724, -2.46934831087215, -2.40019792186105, -2.48231002394073, 
## -2.32156405959185, -2.36201667676347, -2.34424076826288, -2.52036803806616, 
## [lengthy numeric output truncated: transformed values for the remaining
##  predictor columns, including compactness_se and texture_worst, omitted
##  for brevity]
## 4.20865485453779, 4.66591022465882, 4.31648211802796, 4.56277773074029, 
## 4.52511300047923, 4.48529922188488, 5.25367423284224, 4.33301546451814, 
## 5.91342843222292, 5.41210504547204, 4.97928939162397, 4.82725933452894, 
## 4.35596712356414, 4.87438349362812, 4.41238100619215, 4.53420692714647, 
## 4.16123482951012, 4.74895806068259, 4.00896702468464, 4.00896702468464, 
## 4.4643601672029, 4.60059403590832, 4.25596944489609, 5.23266824566795, 
## 3.93861349816864, 4.78126284597665, 4.52511300047923, 4.36735901153395, 
## 4.68658539494554, 3.65486318971188, 4.30567217570052, 4.60353494442031, 
## 4.28894276917322, 4.60720602154206, 4.09644010087559, 4.57470607871923, 
## 3.86090927635756, 3.8044330617929, 3.57313465469149, 4.39238851250693, 
## 4.53798584507042, 4.61087149610063, 4.36654714126594, 4.83395137977617, 
## 4.48298177738099, 4.24487317517505, 4.97613637696856, 4.14341994534426, 
## 3.80547291455154, 3.85278371543535, 4.63347355254343, 4.25682080339513, 
## 3.74860430284226, 3.33461827035041, 4.31482231351887, 4.07626837901377, 
## 3.75605990535429, 4.51826969357939, 4.39319148665841, 4.12185715449991, 
## 4.15768292970039, 4.36329684217773, 4.19907427922337, 4.99123464718118, 
## 4.22510981161163, 4.20081917204225, 3.91301228983002, 4.8966351137691, 
## 4.53420692714647, 4.27631615777365, 4.27378256776766, 4.57693603923099, 
## 4.28305889879237, 4.76825489106129, 5.07086352975503, 3.80235188130695, 
## 5.14392222535321, 4.65442707939607, 4.91933405670256, 4.23630118715586, 
## 4.36654714126594, 4.09369993366443, 5.16098279528589, 4.05498624312565, 
## 4.67020237121107, 3.90606913806289, 3.88510875184718, 4.07902964158111, 
## 4.45421195165986, 4.40759828979585, 5.0751130969265, 3.68582950234986, 
## 4.50682025822858, 4.56203022993144, 3.88310217785028, 4.51979212962875, 
## 4.19032989622769, 4.81853233316707, 4.55229138231215, 4.62037564745409, 
## 5.11064911059594, 4.62547775120469, 4.5784215265119, 4.93862626226538, 
## 3.67592419710848, 4.26446897202882, 4.37465320891252, 4.12185715449991, 
## 4.02315444464448, 4.34615829494115, 5.21722997447023, 4.06981233359486, 
## 5.05073262206906, 4.15056255337101, 4.37141414682473, 3.84972892391568, 
## 4.98180893733216, 4.00326694607662, 4.90054116752379, 3.66708134606272, 
## 4.34451948048039, 4.83261442759844, 4.78740043468991, 4.5784215265119, 
## 4.54401968563775, 4.68018785002482, 4.19732806150465, 4.13446018295962, 
## 4.4791143283873, 4.11552943063158, 4.36248356180047, 3.79609650233605, 
## 4.56203022993144, 4.0642639124341, 4.66662611195927, 4.54928680787707, 
## 4.7969173026623, 5.25649971528638, 4.62547775120469, 4.73577624450677, 
## 4.97171544875428, 5.289607608586, 4.68089952389297, 4.7120079998809, 
## 3.80339271755236, 4.66447781347784, 3.94643210929488, 4.07258146061223, 
## 4.55604179233441, 4.37465320891252, 4.80098466524709, 4.24144819574989, 
## 4.12906740530397, 4.18857701990403, 4.70638151598143, 4.35351877174842, 
## 4.52359396475879, 4.15145381301131, 4.92384900352362, 4.47058365452776, 
## 4.24230491665047, 4.73716716909308, 4.0391263923702, 4.6565843327947, 
## 3.86293601702622, 4.89533172440296, 5.3789243482449, 4.33959612712433, 
## 4.76344497143527, 5.00437117689173, 4.74895806068259, 4.93413813813035, 
## 5.3431303609114, 5.53924614207584, 5.39342611034402, 5.13564484000889, 
## 4.99248872462087, 5.15099568604474, 4.44244806790354, 4.43220506774196, 
## 4.55604179233441, 4.55304193558474, 4.48915660389326, 4.79623875179676, 
## 4.54853507309582, 4.56501883193107, 3.93469391170234, 5.35239743682106, 
## 4.04286814109814, 4.04286814109814, 4.68445475787559, 4.26107290477726, 
## 4.26955375498823, 4.36492255576514, 4.72741395924518, 4.62402111104245, 
## 3.9444800183149, 3.50017120752803, 4.19470624578159, 4.55154059307216, 
## 4.89728655105185, 4.63419912862702, 4.99937492186035, 3.95422348371184, 
## 4.76825489106129, 4.76962741533791, 4.35106785374955, 4.92127004033648, 
## 4.22683509877394, 4.43062526611781, 3.74646945527153, 3.91697011212309, 
## 4.16920652420897, 5.25706443832225, 3.87304201366326, 3.87102449594356, 
## 4.84927432396436, 3.83644251728315, 4.74064100771146, 4.36329684217773, 
## 4.60793956404291, 4.5357192131972, 4.4698066128653, 4.25596944489609, 
## 3.89711029325023, 4.66089306552781, 4.62474954134985, 4.56576539912967, 
## 4.03537843777246, 3.63696722222788, 3.6803321374997, 4.12095426849246, 
## 4.63056906010227, 4.55454233489475, 5.01184689693292, 4.93156984974714, 
## 4.99311552733089, 4.07350375212876, 4.98306775683589, 5.01805965188487, 
## 4.4991565479351, 4.25852258139477, 4.56277773074029, 4.75448703520705, 
## 5.2383625553961, 4.52207395926043, 5.22353090036469, 5.13623697910506, 
## 4.68516517856677, 5.30350877738352, 5.07207842375622, 5.3659655010133, 
## 5.59835498011832, 4.83261442759844, 4.62256358846899, 5.36325756817213, 
## 5.12912215905078, 5.42589455648472, 4.89598350492003, 3.84564929607836, 
## 4.39399418633571, 4.55828921539398, 4.62984238922248, 3.77722283485522, 
## 4.71271039433389, 4.74618886322936, 5.49170795573549, 5.1148322519163, 
## 4.86780056544559, 5.00062492188965, 5.30184459476561, 4.92899890199126, 
## 4.9672866665904, 4.92899890199126, 3.66818867730445, 4.01749018754573, 
## 5.21493487007181, 4.9723474900504, 4.22683509877394, 5.07450645475586, 
## 4.68445475787559, 5.2784323363665, 4.05870187270786, 5.12258263348064, 
## 5.00561868253273, 4.68445475787559, 4.80639733438528, 4.89533172440296, 
## 4.3453390314034, 4.53345042416724, 4.59470138513544, 4.88815077944616, 
## 5.07207842375622, 4.73647180615168, 4.86450250302232, 4.2199261916611, 
## 4.27462740540287, 4.16566697228562, 4.57247403882814, 4.37627106269926, 
## 4.22079093808333, 4.45108063957035, 4.98054948250216, 4.74618886322936, 
## 4.91739656048269, 3.63921234008317, 3.85786568417726, 4.66877250239633, 
## 4.03162425409815, 5.09023188209693, 5.08540372897595, 4.9723474900504, 
## 4.42825357700912, 4.33219157542828, 4.080868618771, 4.63564962539988, 
## 3.66929544300413, 5.11124712616235, 4.17979251337981, 4.37788780062843, 
## 4.48452699202674, 4.00136370799379, 4.98243842645474, 5.0099800597908, 
## 4.37627106269926, 5.09925984037004, 5.04460038283339, 4.51064286511708, 
## 4.71411457400721, 4.82189254351365, 4.90444106677773, 3.93665456449412, 
## 4.81247233739534, 4.5813897395165, 4.58879422265363, 4.4589011555627, 
## 4.18506724920057, 4.61453138729751, 3.82822637799617, 4.92771242891074, 
## 5.19649935898102, 4.06055740376743, 5.05195727576962, 4.65370756626545, 
## 4.38595481282061, 4.31149915094838, 4.8225640340883, 4.73716716909308, 
## 4.23286346360635, 4.55379225305673, 4.41953690473458, 4.34041740713414, 
## 4.06796436841826, 3.81894651195118, 4.90444106677773, 3.70223861716733, 
## 4.51064286511708, 4.44008824220022, 3.70332816754794, 4.34041740713414, 
## 5.21780342847701, 4.57321828343664, 3.94545627679901, 5.09685554064732, 
## 3.89411597678992, 4.97991951661426, 4.03350212602557, 5.08419524488049, 
## 4.27968997796331, 4.22942061247119, 3.4881652680553, 4.53874091045711, 
## 4.64866495826235, 4.26277156096358, 4.16743743155859, 4.62110518228067, 
## 4.55379225305673, 4.08912563560772, 4.31648211802796, 4.44951342604587, 
## 4.85325597503937, 4.05591572233258, 3.73790915557037, 4.14788668902207, 
## 4.53496319009936, 4.74133518110845, 3.95325104412725, 3.93273153639511, 
## 4.60867288259549, 3.89411597678992, 4.62183449590202, 4.78331042054293, 
## 4.57990609284585, 3.221496909402, 4.61526269721601, 5.02054044452534, 
## 4.36085615277429, 3.66597344862832, 4.86318204216207, 3.81584452507305, 
## 3.79296212815746, 4.02880451447434, 3.81584452507305, 4.47136044125528, 
## 4.80639733438528, 3.72176758750256, 5.05256937888436, 5.0908347519036, 
## 3.88310217785028, 4.7385572993759, 4.20778553942227, 4.61964589127126, 
## 4.22597261650997, 4.62838838909742, 4.06518959300002, 5.30461763543518, 
## 4.40120616682715, 4.17715059261016, 4.72112332197546, 5.12258263348064, 
## 4.49531539967184, 5.00374719065992, 4.8796372255157, 4.76138067250417, 
## 4.5813897395165, 5.04214328851219, 5.55137593525295, 4.49069780205868, 
## 4.37869575168726, 4.52207395926043, 4.81584084911856, 3.69678266390047, 
## 4.51750811014081, 3.95907935613536, 4.74133518110845, 5.1720985907114, 
## 4.66089306552781, 4.47756557994607, 4.78057993995547, 5.7250741812419, 
## 3.74326327059181, 3.82513742469316, 4.89076444551659, 4.40919360925164, 
## 3.80131055253282, 4.0391263923702, 4.95649816088697, 4.88160449712241, 
## 4.50987883531407, 5.10705804269966, 5.54784383979588, 5.31568005236977, 
## 4.02503937366727, 4.89011128798296, 5.15217268134405, 4.60573826407968, 
## 5.48447656291417, 4.01465271367111, 4.1362549488282, 4.73368836607226, 
## 4.90638871527655, 4.83595544584434, 3.88310217785028, 4.20865485453779, 
## 4.31648211802796, 4.56277773074029, 4.48529922188488, 4.28390036534887, 
## 5.25367423284224, 4.33301546451814, 4.977398061134, 5.91342843222292, 
## 5.41210504547204, 4.96665334045321, 4.68303329080159, 4.97928939162397, 
## 4.82725933452894, 5.80649267996141, 4.35596712356414, 4.87438349362812, 
## 4.41238100619215, 4.53420692714647, 4.16123482951012, 3.72500546771211, 
## 4.74895806068259, 4.18155210571024, 5.16274115651175, 4.00896702468464, 
## 4.00896702468464, 4.4643601672029, 4.60059403590832, 4.25596944489609, 
## 3.93861349816864, 4.78126284597665, 4.52511300047923, 4.36735901153395, 
## 4.68658539494554, 3.65486318971188, 4.30567217570052, 4.60353494442031, 
## 4.28894276917322, 4.60720602154206, 4.09644010087559, 4.57470607871923, 
## 3.86090927635756, 3.57313465469149, 4.53798584507042, 4.45812026715483, 
## 4.36654714126594, 4.83395137977617, 4.48298177738099, 4.24487317517505, 
## 4.97613637696856, 4.10918408776423, 3.80547291455154, 3.85278371543535, 
## 4.63347355254343, 4.25682080339513, 3.33461827035041, 4.31482231351887, 
## 4.07626837901377, 4.51826969357939, 4.39319148665841, 4.12185715449991, 
## 4.15768292970039, 4.36329684217773, 4.19907427922337, 4.99123464718118, 
## 4.22510981161163, 4.29899485449732, 4.8966351137691, 4.53420692714647, 
## 4.27631615777365, 4.27378256776766, 4.57693603923099, 4.28305889879237, 
## 4.76825489106129, 5.14392222535321, 4.65442707939607, 4.91933405670256, 
## 4.23630118715586, 4.36654714126594, 4.09369993366443, 3.92880160458909, 
## 4.05498624312565, 4.67020237121107, 4.1736232924066, 4.07902964158111, 
## 4.45421195165986, 3.68582950234986, 4.50682025822858, 4.56203022993144, 
## 4.51979212962875, 4.19032989622769, 4.81853233316707, 4.55229138231215, 
## 4.2525608748516, 4.62037564745409, 4.44401997544328, 4.62547775120469, 
## 4.5784215265119, 4.93862626226538, 4.26446897202882, 4.37465320891252, 
## 4.12185715449991, 4.02315444464448, 4.34615829494115, 5.21722997447023, 
## 4.06981233359486, 5.05073262206906, 4.17715059261016, 4.78535628237399, 
## 3.84972892391568, 4.98180893733216, 4.10008852884187, 4.00326694607662, 
## 4.90054116752379, 3.66708134606272, 4.34451948048039, 4.83261442759844, 
## 4.78740043468991, 4.5784215265119, 4.54401968563775, 4.68018785002482, 
## 4.13446018295962, 4.4791143283873, 4.72112332197546, 4.11552943063158, 
## 4.48838562918339, 4.56203022993144, 4.66662611195927, 4.54928680787707, 
## 4.7969173026623, 4.62547775120469, 4.73577624450677, 4.97171544875428, 
## 5.289607608586, 4.95204221081654, 3.80339271755236, 4.79962962791266, 
## 4.66447781347784, 3.94643210929488, 4.07258146061223, 4.37465320891252, 
## 4.80098466524709, 4.24144819574989, 5.00249875093668, 4.12906740530397, 
## 4.18857701990403, 4.70638151598143, 4.35351877174842, 4.52359396475879, 
## 4.93221217091729, 4.15145381301131, 4.92384900352362, 4.47058365452776, 
## 4.24230491665047, 4.73716716909308, 4.0391263923702, 5.20746150097278, 
## 3.86293601702622, 4.38595481282061, 4.27968997796331, 5.3789243482449, 
## 4.33959612712433, 4.76344497143527, 5.00437117689173, 4.74895806068259, 
## 4.93413813813035, 5.3431303609114, 4.29062114000629, 5.39342611034402, 
## 5.13564484000889, 4.99248872462087, 5.29016509938456, 4.96031120783185, 
## 4.99874968738276, 4.44244806790354, 4.55604179233441, 4.55304193558474, 
## 4.48915660389326, 4.79623875179676, 4.54853507309582, 4.56501883193107, 
## 5.0817765373419, 3.93469391170234, 4.04286814109814, 4.26955375498823, 
## 4.36492255576514, 4.72741395924518, 4.62402111104245, 4.40360525003346, 
## 3.50017120752803, 4.19470624578159, 4.55154059307216, 4.89728655105185, 
## 4.24316132001179, 4.63419912862702, 4.59764951947146, 3.74433252136007, 
## 4.76825489106129, 4.76962741533791, 4.46747395638393, 4.92127004033648, 
## 4.14163080129856, 4.43062526611781, 3.74646945527153, 3.91697011212309, 
## 4.45421195165986, 3.88911647055105, 5.25706443832225, 3.87304201366326, 
## 3.87102449594356, 4.84927432396436, 3.83644251728315, 4.74064100771146, 
## 4.36329684217773, 4.60793956404291, 4.5357192131972, 4.4698066128653, 
## 4.25596944489609, 4.66089306552781, 4.62474954134985, 4.56576539912967, 
## 3.95713827061974, 4.53042200855463, 3.6803321374997, 4.73508048403053, 
## 4.7921634933305, 4.12095426849246, 4.67662633771017, 4.63056906010227, 
## 5.01184689693292, 4.93156984974714, 4.07350375212876, 4.98306775683589, 
## 5.01805965188487, 4.4991565479351, 4.81112363659015, 4.25852258139477, 
## 4.28137504979732, 4.56277773074029, 4.96538620503109, 4.5006912691636, 
## 4.75448703520705, 5.2383625553961, 4.52207395926043, 5.22353090036469, 
## 5.17559876883805, 4.35106785374955, 5.13623697910506, 4.68516517856677, 
## 5.07207842375622, 5.3659655010133, 5.59835498011832, 4.83261442759844, 
## 4.62256358846899, 5.36325756817213, 5.42589455648472), smoothness_worst = c(-1.40183650096854, 
## -1.55220612193591, -1.46803241840246, -1.24682371794616, -1.49563269960508, 
## -1.34354252185399, -1.46880785019902, -1.3733916940331, -1.3231236019111, 
## -1.57721525174517, -1.48685434191659, -1.59985855657163, -1.39154061904023, 
## -1.46031914038324, -1.34420937914042, -1.46958403520788, -1.52091312537128, 
## -1.51595585913422, -1.48923877900503, -1.33888905569403, -1.42981400758559, 
## -1.43724005615725, -1.51021197356396, -1.39507651854449, -1.54490367636292, 
## -1.39649510187451, -1.39756063251601, -1.44323013407429, -1.46725773804559, 
## -1.42318788167546, -1.67785425471799, -1.6941150453434, -1.40613457096515, 
## -1.61706993017923, -1.54833143908053, -1.44548787503347, -1.52715454306599, 
## -1.44888634599264, -1.58573917754001, -1.62131793415638, -1.61942689900374, 
## -1.59390494932646, -1.48963688528217, -1.54747305858427, -1.40112232198728, 
## -1.49804385635261, -1.65226073987208, -1.36309704347488, -1.39578550727826, 
## -1.39543093725709, -1.67097631500598, -1.39259981106491, -1.42870591413312, 
## -1.53008450627828, -1.41516143031734, -1.48092447390449, -1.57944902805708, 
## -1.44661911204314, -1.46532431312868, -1.4549636269214, -1.39578550727826, 
## -1.53050398030052, -1.42539062731438, -1.4195297558875, -1.48844316751003, 
## -1.49442988431315, -1.48606112392627, -1.52340366783141, -1.54747305858427, 
## -1.53555552814175, -1.60725220038208, -1.5440491267089, -1.50980330884167, 
## -1.42759932289029, -1.57321108034171, -1.5957317618608, -1.47308623661435, 
## -1.47308623661435, -1.54064041483767, -1.41552457144521, -1.56045122596102, 
## -1.32019949774685, -1.57587816190535, -1.37477388194629, -1.45916852210433, 
## -1.53134361222597, -1.71744592147873, -1.52590223262381, -1.36617225167842, 
## -1.57810796637572, -1.31502542614399, -1.32247294325797, -1.62749770507757, 
## -1.42870591413312, -1.44548787503347, -1.52049881052667, -1.58799851382772, 
## -1.45763692373978, -1.65226073987208, -1.47464772083503, -1.42944447588348, 
## -1.48725124924937, -1.55870797763166, -1.45993541772284, -1.53809377073995, 
## -1.57321108034171, -1.49844643513038, -1.55393453399349, -1.52008471650699, 
## -1.48013716393439, -1.52091312537128, -1.62559067609645, -1.47582084584847, 
## -1.6642143643724, -1.52715454306599, -1.39898347234133, -1.52924624061376, 
## -1.50898661988177, -1.4549636269214, -1.56132435202346, -1.47817229601729, 
## -1.72699126094039, -1.53640067966101, -1.5816894624428, -1.50207895410352, 
## -1.47036097520833, -1.46493818798572, -1.55653452068464, -1.64050222044834, 
## -1.5830369390616, -1.49483061874586, -1.56176129175274, -1.49123131965263, 
## -1.48487278500486, -1.43500511941966, -1.56176129175274, -1.62559067609645, 
## -1.58573917754001, -1.52548524538613, -1.47935063364482, -1.76360044982201, 
## -1.58573917754001, -1.4572544797648, -1.45002235376791, -1.47974380144674, 
## -1.60447093386092, -1.43873343573348, -1.55393453399349, -1.51636776051741, 
## -1.50288846770294, -1.61612925391863, -1.69406265936793, -1.82475510742266, 
## -1.51925718980053, -1.61565936633134, -1.72435994432552, -1.51677987989553, 
## -1.4928289806949, -1.43314735095279, -1.20942243559218, -1.47503857052102, 
## -1.45002235376791, -1.47935063364482, -1.60911220342376, -1.55914341511068, 
## -1.53428953643525, -1.5500510954301, -1.42465571958716, -1.46185586833818, 
## -1.68681869997878, -1.50410430665766, -1.48210690487441, -1.42723079211313, 
## -1.53597798799109, -1.52715454306599, -1.50898661988177, -1.55696871583034, 
## -1.34822152100929, -1.3733916940331, -1.70382130009779, -1.66300999675367, 
## -1.55870797763166, -1.44511114728777, -1.53134361222597, -1.55653452068464, 
## -1.49925221154687, -1.53513329980447, -1.50654165744372, -1.69511112194495, 
## -1.46070304656148, -1.6137828087638, -1.54533130870895, -1.44699654296734, 
## -1.48963688528217, -1.56394977145742, -1.57632359474443, -1.37097832354581, 
## -1.47895766029479, -1.44775193389517, -1.5338680007895, -1.35225267363822, 
## -1.31341485905556, -1.43835983334882, -1.62179145524877, -1.42502309090337, 
## -1.56307362195144, -1.6602076580791, -1.54447628247046, -1.5114392519592, 
## -1.73845567122268, -1.53767014868934, -1.50167451022945, -1.54319552829131, 
## -1.5500510954301, -1.63807637754432, -1.57543299233561, -1.34654749739611, 
## -1.6448892914313, -1.43985527553856, -1.49925221154687, -1.64391208808319, 
## -1.60678792964066, -1.68221925036159, -1.64881136280715, -1.60586026106885, 
## -1.6635617469794, -1.52049881052667, -1.4530595505586, -1.49764148352008, 
## -1.52673688116944, -1.58213835253834, -1.69805517741032, -1.59710481795407, 
## -1.69108309352359, -1.44850803159816, -1.63807637754432, -1.50654165744372, 
## -1.48013716393439, -1.59390494932646, -1.68588605530976, -1.60123926203061, 
## -1.7134488980851, -1.65473472907519, -1.55870797763166, -1.65721725562206, 
## -1.57143825131019, -1.61801181100177, -1.50654165744372, -1.61237852008231, 
## -1.66220820932499, -1.4719171273346, -1.55957910209346, -1.74733746712263, 
## -1.48447706788501, -1.63614151870987, -1.42907511146058, -1.41262392240711, 
## -1.54447628247046, -1.49163043179358, -1.5338680007895, -1.41334812638161, 
## -1.47503857052102, -1.47152780472688, -1.53092368218281, -1.47582084584847, 
## -1.58393660330422, -1.55914341511068, -1.4660971225983, -1.59481779571206, 
## -1.44737415018464, -1.48447706788501, -1.44097866837128, -1.52840888279519, 
## -1.55436724964736, -1.47817229601729, -1.56088766354308, -1.50127027462458, 
## -1.53050398030052, -1.56001503892749, -1.52882744837297, -1.66481730933909, 
## -1.42796802003095, -1.43537718258408, -1.38837110473461, -1.72206575085892, 
## -1.50817078309823, -1.58981090884964, -1.65523054910995, -1.51225851047311, 
## -1.69905686874236, -1.5957317618608, -1.51021197356396, -1.5147214601588, 
## -1.53513329980447, -1.50005881482507, -1.52840888279519, -1.47935063364482, 
## -1.60216116098467, -1.57232414528024, -1.42355459440915, -1.61191101750064, 
## -1.55350206373273, -1.59436123275711, -1.61051028430753, -1.53640067966101, 
## -1.22152482402453, -1.40613457096515, -1.47464772083503, -1.58393660330422, 
## -1.52091312537128, -1.61331441525618, -1.5466156397054, -1.50817078309823, 
## -1.46185586833818, -1.3810220723387, -1.44586477827542, -1.47269634305255, 
## -1.63036761017151, -1.47425706282928, -1.68645585838021, -1.60493375343096, 
## -1.54962081859927, -1.49563269960508, -1.58124084060449, -1.63904574231491, 
## -1.50005881482507, -1.56614458542139, -1.6933296655119, -1.45344000592858, 
## -1.55134338153222, -1.46455224894641, -1.49764148352008, -1.620844718463, 
## -1.62511469446343, -1.59208260255658, -1.47503857052102, -1.44022956699973, 
## -1.52008471650699, -1.54747305858427, -1.52091312537128, -1.47935063364482, 
## -1.59253777203273, -1.48606112392627, -1.60216116098467, -1.5147214601588, 
## -1.64782884752383, -1.62940972163096, -1.47777990462435, -1.46571062459338, 
## -1.38661545837086, -1.56088766354308, -1.40505795674331, -1.5626359263642, 
## -1.57099569380803, -1.60586026106885, -1.64586781621825, -1.48884087316261, 
## -1.62749770507757, -1.52132766132703, -1.48487278500486, -1.55740315875384, 
## -1.61942689900374, -1.49804385635261, -1.69045742134115, -1.43612181989407, 
## -1.56526589687447, -1.58124084060449, -1.48289617228522, -1.53050398030052, 
## -1.56088766354308, -1.40973345879038, -1.50857859510293, -1.53344669542821, 
## -1.5683457783542, -1.50491591473139, -1.4900351922394, -1.56570511374899, 
## -1.34721671179882, -1.55393453399349, -1.6308470262566, -1.69228385560889, 
## -1.51266846242613, -1.51513270923389, -1.58981090884964, -1.69427222671242, 
## -1.50451000564169, -1.49442988431315, -1.56526589687447, -1.63132675755449, 
## -1.48804566180285, -1.43129381087306, -1.57989658074747, -1.59481779571206, 
## -1.44135347866954, -1.65821267277101, -1.54447628247046, -1.66965927489485, 
## -1.60586026106885, -1.56176129175274, -1.48053072134206, -1.46031914038324, 
## -1.41697874282747, -1.56922805174919, -1.30732186646871, -1.27470462970427, 
## -1.38766840258161, -1.36343818040496, -1.62702048148246, -1.59618916589, 
## -1.41916483956829, -1.55870797763166, -1.54876099075454, -1.4530595505586, 
## -1.44964350687541, -1.47856488116327, -1.46339554627544, -1.45610823795418, 
## -1.32410051225878, -1.50857859510293, -1.61051028430753, -1.47542961211584, 
## -1.52924624061376, -1.48764835579712, -1.43873343573348, -1.54106567708497, 
## -1.47425706282928, -1.53936603838066, -1.351243091246, -1.54447628247046, 
## -1.41117742285008, -1.50735579633476, -1.50939485770457, -1.63324884496924, 
## -1.54106567708497, -1.5321841580014, -1.56922805174919, -1.62037180769071, 
## -1.55393453399349, -1.61284631918178, -1.55696871583034, -1.62702048148246, 
## -1.64979521705898, -1.52673688116944, -1.55091237656436, -1.70042993006451, 
## -1.47856488116327, -1.48250144041222, -1.48131842185686, -1.58393660330422, 
## -1.59618916589, -1.39189353324403, -1.71490460342403, -1.40183650096854, 
## -1.55220612193591, -1.46803241840246, -1.24682371794616, -1.49563269960508, 
## -1.46880785019902, -1.39048277340616, -1.3231236019111, -1.57721525174517, 
## -1.64440052484827, -1.39154061904023, -1.38206798475272, -1.46031914038324, 
## -1.34420937914042, -1.44210361946329, -1.51595585913422, -1.48923877900503, 
## -1.48487278500486, -1.33888905569403, -1.42981400758559, -1.43724005615725, 
## -1.51021197356396, -1.39507651854449, -1.54490367636292, -1.45002235376791, 
## -1.39756063251601, -1.46725773804559, -1.42318788167546, -1.46725773804559, 
## -1.67785425471799, -1.6941150453434, -1.40613457096515, -1.61706993017923, 
## -1.54833143908053, -1.43537718258408, -1.44548787503347, -1.38171920194798, 
## -1.52715454306599, -1.44888634599264, -1.62131793415638, -1.61942689900374, 
## -1.59390494932646, -1.53428953643525, -1.48963688528217, -1.40112232198728, 
## -1.49804385635261, -1.36309704347488, -1.53640067966101, -1.39578550727826, 
## -1.67097631500598, -1.32377475234327, -1.39259981106491, -1.42870591413312, 
## -1.53008450627828, -1.45344000592858, -1.52757243043846, -1.57188106834428, 
## -1.56570511374899, -1.41516143031734, -1.48092447390449, -1.57944902805708, 
## -1.44661911204314, -1.4549636269214, -1.39578550727826, -1.42539062731438, 
## -1.43314735095279, -1.4195297558875, -1.48844316751003, -1.49442988431315, 
## -1.48606112392627, -1.52340366783141, -1.52423562829466, -1.53555552814175, 
## -1.60725220038208, -1.5440491267089, -1.50980330884167, -1.42759932289029, 
## -1.5957317618608, -1.51925718980053, -1.47308623661435, -1.47308623661435, 
## -1.54064041483767, -1.41552457144521, -1.60354616102531, -1.42465571958716, 
## -1.56045122596102, -1.32019949774685, -1.37858668131413, -1.37408250358882, 
## -1.45916852210433, -1.53134361222597, -1.71744592147873, -1.52590223262381, 
## -1.36617225167842, -1.57810796637572, -1.32247294325797, -1.42870591413312, 
## -1.44548787503347, -1.37581201847778, -1.52049881052667, -1.65028764768472, 
## -1.45763692373978, -1.65226073987208, -1.49083240917807, -1.53640067966101, 
## -1.47464772083503, -1.48725124924937, -1.55870797763166, -1.47777990462435, 
## -1.57321108034171, -1.48092447390449, -1.55393453399349, -1.52008471650699, 
## -1.48013716393439, -1.52091312537128, -1.62559067609645, -1.49123131965263, 
## -1.6642143643724, -1.51925718980053, -1.67734248825545, -1.52715454306599, 
## -1.39898347234133, -1.52924624061376, -1.50898661988177, -1.4549636269214, 
## -1.56132435202346, -1.72699126094039, -1.53640067966101, -1.5816894624428, 
## -1.50207895410352, -1.60308420688558, -1.47036097520833, -1.46493818798572, 
## -1.64050222044834, -1.47113867198634, -1.5830369390616, -1.49483061874586, 
## -1.56176129175274, -1.49123131965263, -1.48487278500486, -1.43500511941966, 
## -1.56176129175274, -1.62559067609645, -1.58573917754001, -1.52548524538613, 
## -1.76360044982201, -1.58573917754001, -1.4572544797648, -1.45002235376791, 
## -1.60447093386092, -1.52548524538613, -1.43873343573348, -1.55393453399349, 
## -1.51636776051741, -1.50288846770294, -1.61612925391863, -1.43426150312693, 
## -1.69406265936793, -1.82475510742266, -1.51925718980053, -1.36173387206049, 
## -1.72435994432552, -1.51677987989553, -1.42723079211313, -1.47386659627565, 
## -1.4928289806949, -1.43314735095279, -1.20942243559218, -1.47503857052102, 
## -1.47935063364482, -1.60911220342376, -1.50572836428609, -1.55914341511068, 
## -1.53428953643525, -1.59481779571206, -1.5500510954301, -1.42465571958716, 
## -1.46185586833818, -1.68681869997878, -1.50410430665766, -1.48210690487441, 
## -1.52132766132703, -1.49483061874586, -1.42723079211313, -1.53597798799109, 
## -1.52715454306599, -1.59756306683032, -1.55696871583034, -1.34822152100929, 
## -1.3733916940331, -1.70382130009779, -1.66300999675367, -1.55870797763166, 
## -1.44511114728777, -1.53134361222597, -1.45382064104328, -1.55653452068464, 
## -1.62321385571701, -1.53513329980447, -1.64440052484827, -1.50654165744372, 
## -1.46070304656148, -1.44097866837128, -1.54533130870895, -1.48963688528217, 
## -1.57632359474443, -1.37097832354581, -1.44775193389517, -1.44210361946329, 
## -1.5338680007895, -1.35225267363822, -1.44511114728777, -1.31341485905556, 
## -1.43835983334882, -1.55134338153222, -1.59848041238222, -1.62179145524877, 
## -1.42502309090337, -1.48487278500486, -1.56307362195144, -1.6602076580791, 
## -1.54447628247046, -1.5114392519592, -1.73845567122268, -1.50207895410352, 
## -1.53767014868934, -1.45916852210433, -1.51967084302662, -1.50167451022945, 
## -1.54319552829131, -1.5500510954301, -1.63807637754432, -1.57543299233561, 
## -1.6448892914313, -1.43985527553856, -1.49925221154687, -1.64391208808319, 
## -1.60678792964066, -1.68221925036159, -1.64881136280715, -1.60586026106885, 
## -1.6635617469794, -1.52049881052667, -1.4530595505586, -1.49764148352008, 
## -1.52673688116944, -1.69805517741032, -1.69108309352359, -1.59436123275711, 
## -1.63807637754432, -1.50654165744372, -1.48013716393439, -1.59390494932646, 
## -1.68588605530976, -1.66970988531176, -1.7134488980851, -1.65473472907519, 
## -1.55870797763166, -1.65721725562206, -1.61801181100177, -1.50654165744372, 
## -1.61237852008231, -1.4719171273346, -1.55957910209346, -1.74733746712263, 
## -1.48447706788501, -1.63614151870987, -1.42907511146058, -1.41262392240711, 
## [... printed finalModel training data truncated: standardized values for the
## six retained predictors and the integer-coded .outcome vector omitted ...]
##     mfinal = 100, coeflearn = "Breiman", control = list(minsplit = 0, 
##         minbucket = 0, cp = -1, maxcompete = 4L, maxsurrogate = 5L, 
##         usesurrogate = 2L, surrogatestyle = 0L, maxdepth = 6, 
##         xval = 0))
## 
## $xNames
## [1] "texture_mean"     "smoothness_mean"  "compactness_se"   "texture_worst"   
## [5] "smoothness_worst" "symmetry_worst"  
## 
## $problemType
## [1] "Classification"
## 
## $tuneValue
##   mfinal maxdepth coeflearn
## 6    100        6   Breiman
## 
## $obsLevels
## [1] "M" "B"
## attr(,"ordered")
## [1] FALSE
## 
## $param
## list()
## 
## attr(,"vardep.summary")
##   M   B 
## 340 572 
## attr(,"class")
## [1] "boosting"
MBS_AB_Tune$results
##   coeflearn maxdepth mfinal       ROC      Sens      Spec      ROCSD     SensSD
## 1   Breiman        4     50 0.9506186 0.8800000 0.9335561 0.01896836 0.04405224
## 3   Breiman        5     50 0.9575898 0.8941176 0.9412265 0.01767991 0.05369829
## 5   Breiman        6     50 0.9619147 0.8988235 0.9415927 0.01476659 0.04458899
## 2   Breiman        4    100 0.9575073 0.8994118 0.9384409 0.01794364 0.04908329
## 4   Breiman        5    100 0.9631176 0.8982353 0.9412265 0.01755778 0.04931041
## 6   Breiman        6    100 0.9647554 0.9011765 0.9398352 0.01396669 0.04868513
##       SpecSD
## 1 0.02758453
## 3 0.02202985
## 5 0.02460794
## 2 0.02898207
## 4 0.02160683
## 6 0.02269900
(MBS_AB_Train_AUROC <- MBS_AB_Tune$results[MBS_AB_Tune$results$mfinal==MBS_AB_Tune$bestTune$mfinal &
                                           MBS_AB_Tune$results$maxdepth==MBS_AB_Tune$bestTune$maxdepth &
                                           MBS_AB_Tune$results$coeflearn==MBS_AB_Tune$bestTune$coeflearn,
                                           c("ROC")])
## [1] 0.9647554
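The triple logical filter above can be collapsed into a single join: `merge()` matches on all shared column names, so merging the results table against the best-tune row returns exactly one record. A minimal self-contained sketch, with toy frames standing in for `MBS_AB_Tune$results` and `MBS_AB_Tune$bestTune`:

```r
##################################
# Sketch: extracting the best-tune
# row via merge() instead of a
# triple logical filter
##################################
results  <- data.frame(mfinal    = c(50, 100),
                       maxdepth  = c(6, 6),
                       coeflearn = "Breiman",
                       ROC       = c(0.9619147, 0.9647554))
bestTune <- data.frame(mfinal    = 100,
                       maxdepth  = 6,
                       coeflearn = "Breiman")
# merge() joins on the shared hyperparameter columns, leaving one row
merge(results, bestTune)$ROC
## [1] 0.9647554
```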
##################################
# Identifying and plotting the
# best model predictors
##################################
MBS_AB_VarImp <- varImp(MBS_AB_Tune, scale = TRUE)
plot(MBS_AB_VarImp,
     top=6,
     scales=list(y=list(cex = .95)),
     main="Ranked Variable Importance : Adaptive Boosting",
     xlab="Scaled Variable Importance Metrics",
     ylab="Predictors",
     cex=2,
     origin=0,
     alpha=0.45)
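Beyond the plot, the underlying importance table can be inspected directly. A sketch, assuming the `MBS_AB_VarImp` object from the chunk above and assuming the model-specific importance lands in a single `Overall` column:

```r
##################################
# Inspecting the scaled variable
# importance table directly
##################################
MBS_AB_Imp_Table <- MBS_AB_VarImp$importance
# Sort descending on the scaled metric and keep the top six predictors
head(MBS_AB_Imp_Table[order(-MBS_AB_Imp_Table$Overall), , drop = FALSE], 6)
```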

##################################
# Independently evaluating the model
# on the test set
##################################
MBS_AB_Test <- data.frame(MBS_AB_Test_Observed = MA_Test$diagnosis,
                          MBS_AB_Test_Predicted = predict(MBS_AB_Tune,
                                                          MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                          type = "prob"))

##################################
# Reporting the independent evaluation results
# for the test set
##################################
MBS_AB_Test_ROC <- roc(response = MBS_AB_Test$MBS_AB_Test_Observed,
                       predictor = MBS_AB_Test$MBS_AB_Test_Predicted.M,
                       levels = rev(levels(MBS_AB_Test$MBS_AB_Test_Observed)))

(MBS_AB_Test_AUROC <- auc(MBS_AB_Test_ROC)[1])
## [1] 0.9936284
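The AUROC above is threshold-free; a confusion matrix at the default 0.5 cutoff complements it with class-level metrics. A sketch, assuming the `MBS_AB_Tune` model and the `MA_Test` split from earlier chunks:

```r
##################################
# Supplementing the test AUROC with
# class-level metrics at the
# default 0.5 cutoff
##################################
MBS_AB_Test_Class <- predict(MBS_AB_Tune,
                             MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                             type = "raw")
# Accuracy, kappa, sensitivity and specificity against the observed diagnosis
confusionMatrix(data = MBS_AB_Test_Class,
                reference = MA_Test$diagnosis)
```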

1.5.2 Stochastic Gradient Boosting (MBS_GBM)


Details.

Code Chunk | Output
##################################
# Setting the conditions
# for hyperparameter tuning
##################################
GBM_Grid = expand.grid(n.trees = 500,
                       interaction.depth = c(4,5,6),
                       shrinkage = c(0.1,0.01,0.001),
                       n.minobsinnode = c(5, 10, 15))
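As a quick sanity check on the tuning budget, the grid crosses 1 `n.trees` value with 3 interaction depths, 3 shrinkage rates and 3 minimum node sizes, so 27 candidate models are resampled (sketch assumes `GBM_Grid` from the chunk above):

```r
##################################
# Verifying the size of the
# hyperparameter search grid
##################################
# expand.grid() forms the full cross: 1 x 3 x 3 x 3 = 27 rows
nrow(GBM_Grid)
## [1] 27
```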

##################################
# Running the stochastic gradient boosting model
# by setting the caret method to 'gbm'
##################################
set.seed(12345678)
MBS_GBM_Tune <- train(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
                      y = MA_Train$diagnosis,
                      method = "gbm",
                      tuneGrid = GBM_Grid,
                      metric = "ROC",
                      trControl = RKFold_Control)
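`train()` forwards extra arguments to the underlying fitting function, so the per-iteration deviance log printed below can be suppressed by passing gbm's `verbose` flag through the dots. A sketch, assuming all other arguments stay as in the chunk above:

```r
##################################
# Optional: suppressing the verbose
# per-iteration gbm log via the
# train() dots argument
##################################
set.seed(12345678)
MBS_GBM_Tune <- train(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
                      y = MA_Train$diagnosis,
                      method = "gbm",
                      tuneGrid = GBM_Grid,
                      metric = "ROC",
                      trControl = RKFold_Control,
                      verbose = FALSE)
```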
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3194             nan     0.0010    0.0004
##      3        1.3186             nan     0.0010    0.0003
##      4        1.3177             nan     0.0010    0.0004
##      5        1.3166             nan     0.0010    0.0005
##      6        1.3157             nan     0.0010    0.0004
##      7        1.3148             nan     0.0010    0.0004
##      8        1.3139             nan     0.0010    0.0004
##      9        1.3130             nan     0.0010    0.0004
##     10        1.3121             nan     0.0010    0.0004
##     20        1.3030             nan     0.0010    0.0004
##     40        1.2852             nan     0.0010    0.0004
##     60        1.2679             nan     0.0010    0.0004
##     80        1.2514             nan     0.0010    0.0003
##    100        1.2355             nan     0.0010    0.0003
##    120        1.2202             nan     0.0010    0.0004
##    140        1.2053             nan     0.0010    0.0003
##    160        1.1908             nan     0.0010    0.0003
##    180        1.1771             nan     0.0010    0.0003
##    200        1.1639             nan     0.0010    0.0003
##    220        1.1512             nan     0.0010    0.0003
##    240        1.1384             nan     0.0010    0.0003
##    260        1.1261             nan     0.0010    0.0003
##    280        1.1143             nan     0.0010    0.0003
##    300        1.1027             nan     0.0010    0.0002
##    320        1.0915             nan     0.0010    0.0002
##    340        1.0806             nan     0.0010    0.0003
##    360        1.0701             nan     0.0010    0.0002
##    380        1.0599             nan     0.0010    0.0002
##    400        1.0500             nan     0.0010    0.0002
##    420        1.0404             nan     0.0010    0.0002
##    440        1.0309             nan     0.0010    0.0002
##    460        1.0219             nan     0.0010    0.0002
##    480        1.0129             nan     0.0010    0.0002
##    500        1.0042             nan     0.0010    0.0002
## 
## [... per-iteration gbm training logs for the remaining resampling folds and
## hyperparameter candidates truncated; each block repeats the pattern above ...]
##    440        1.0353             nan     0.0010    0.0002
##    460        1.0260             nan     0.0010    0.0002
##    480        1.0172             nan     0.0010    0.0002
##    500        1.0084             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0004
##      2        1.3191             nan     0.0010    0.0005
##      3        1.3182             nan     0.0010    0.0005
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0004
##      6        1.3153             nan     0.0010    0.0004
##      7        1.3144             nan     0.0010    0.0005
##      8        1.3134             nan     0.0010    0.0005
##      9        1.3124             nan     0.0010    0.0005
##     10        1.3116             nan     0.0010    0.0004
##     20        1.3020             nan     0.0010    0.0004
##     40        1.2833             nan     0.0010    0.0004
##     60        1.2654             nan     0.0010    0.0004
##     80        1.2479             nan     0.0010    0.0003
##    100        1.2310             nan     0.0010    0.0003
##    120        1.2152             nan     0.0010    0.0003
##    140        1.1995             nan     0.0010    0.0003
##    160        1.1842             nan     0.0010    0.0003
##    180        1.1696             nan     0.0010    0.0003
##    200        1.1553             nan     0.0010    0.0003
##    220        1.1416             nan     0.0010    0.0003
##    240        1.1285             nan     0.0010    0.0002
##    260        1.1155             nan     0.0010    0.0003
##    280        1.1031             nan     0.0010    0.0003
##    300        1.0909             nan     0.0010    0.0002
##    320        1.0792             nan     0.0010    0.0003
##    340        1.0676             nan     0.0010    0.0003
##    360        1.0562             nan     0.0010    0.0002
##    380        1.0456             nan     0.0010    0.0002
##    400        1.0351             nan     0.0010    0.0002
##    420        1.0249             nan     0.0010    0.0002
##    440        1.0149             nan     0.0010    0.0002
##    460        1.0054             nan     0.0010    0.0002
##    480        0.9959             nan     0.0010    0.0002
##    500        0.9867             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0005
##      2        1.3191             nan     0.0010    0.0005
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3174             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0004
##      6        1.3155             nan     0.0010    0.0004
##      7        1.3145             nan     0.0010    0.0005
##      8        1.3135             nan     0.0010    0.0004
##      9        1.3126             nan     0.0010    0.0004
##     10        1.3117             nan     0.0010    0.0004
##     20        1.3026             nan     0.0010    0.0004
##     40        1.2840             nan     0.0010    0.0004
##     60        1.2666             nan     0.0010    0.0004
##     80        1.2495             nan     0.0010    0.0004
##    100        1.2330             nan     0.0010    0.0004
##    120        1.2168             nan     0.0010    0.0003
##    140        1.2010             nan     0.0010    0.0003
##    160        1.1860             nan     0.0010    0.0003
##    180        1.1716             nan     0.0010    0.0003
##    200        1.1574             nan     0.0010    0.0003
##    220        1.1436             nan     0.0010    0.0003
##    240        1.1304             nan     0.0010    0.0003
##    260        1.1174             nan     0.0010    0.0003
##    280        1.1051             nan     0.0010    0.0002
##    300        1.0930             nan     0.0010    0.0003
##    320        1.0812             nan     0.0010    0.0003
##    340        1.0698             nan     0.0010    0.0002
##    360        1.0589             nan     0.0010    0.0002
##    380        1.0479             nan     0.0010    0.0003
##    400        1.0374             nan     0.0010    0.0002
##    420        1.0270             nan     0.0010    0.0002
##    440        1.0172             nan     0.0010    0.0002
##    460        1.0073             nan     0.0010    0.0002
##    480        0.9980             nan     0.0010    0.0002
##    500        0.9890             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0004
##      2        1.3193             nan     0.0010    0.0005
##      3        1.3182             nan     0.0010    0.0005
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0004
##      6        1.3154             nan     0.0010    0.0004
##      7        1.3145             nan     0.0010    0.0005
##      8        1.3135             nan     0.0010    0.0005
##      9        1.3126             nan     0.0010    0.0004
##     10        1.3116             nan     0.0010    0.0005
##     20        1.3023             nan     0.0010    0.0004
##     40        1.2841             nan     0.0010    0.0004
##     60        1.2665             nan     0.0010    0.0004
##     80        1.2492             nan     0.0010    0.0004
##    100        1.2330             nan     0.0010    0.0003
##    120        1.2172             nan     0.0010    0.0003
##    140        1.2019             nan     0.0010    0.0004
##    160        1.1870             nan     0.0010    0.0003
##    180        1.1727             nan     0.0010    0.0003
##    200        1.1587             nan     0.0010    0.0003
##    220        1.1454             nan     0.0010    0.0003
##    240        1.1322             nan     0.0010    0.0003
##    260        1.1195             nan     0.0010    0.0002
##    280        1.1069             nan     0.0010    0.0002
##    300        1.0950             nan     0.0010    0.0003
##    320        1.0835             nan     0.0010    0.0002
##    340        1.0719             nan     0.0010    0.0003
##    360        1.0610             nan     0.0010    0.0002
##    380        1.0503             nan     0.0010    0.0002
##    400        1.0401             nan     0.0010    0.0002
##    420        1.0298             nan     0.0010    0.0002
##    440        1.0200             nan     0.0010    0.0002
##    460        1.0102             nan     0.0010    0.0002
##    480        1.0008             nan     0.0010    0.0002
##    500        0.9917             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0004
##      2        1.3192             nan     0.0010    0.0005
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3171             nan     0.0010    0.0005
##      5        1.3161             nan     0.0010    0.0004
##      6        1.3152             nan     0.0010    0.0004
##      7        1.3143             nan     0.0010    0.0005
##      8        1.3133             nan     0.0010    0.0004
##      9        1.3122             nan     0.0010    0.0005
##     10        1.3112             nan     0.0010    0.0005
##     20        1.3014             nan     0.0010    0.0004
##     40        1.2821             nan     0.0010    0.0004
##     60        1.2634             nan     0.0010    0.0004
##     80        1.2455             nan     0.0010    0.0004
##    100        1.2280             nan     0.0010    0.0004
##    120        1.2111             nan     0.0010    0.0004
##    140        1.1948             nan     0.0010    0.0004
##    160        1.1793             nan     0.0010    0.0003
##    180        1.1643             nan     0.0010    0.0003
##    200        1.1497             nan     0.0010    0.0003
##    220        1.1354             nan     0.0010    0.0003
##    240        1.1217             nan     0.0010    0.0002
##    260        1.1082             nan     0.0010    0.0003
##    280        1.0952             nan     0.0010    0.0003
##    300        1.0828             nan     0.0010    0.0003
##    320        1.0705             nan     0.0010    0.0003
##    340        1.0587             nan     0.0010    0.0003
##    360        1.0473             nan     0.0010    0.0002
##    380        1.0361             nan     0.0010    0.0003
##    400        1.0250             nan     0.0010    0.0002
##    420        1.0142             nan     0.0010    0.0002
##    440        1.0037             nan     0.0010    0.0002
##    460        0.9937             nan     0.0010    0.0002
##    480        0.9839             nan     0.0010    0.0002
##    500        0.9742             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3201             nan     0.0010    0.0005
##      2        1.3191             nan     0.0010    0.0005
##      3        1.3180             nan     0.0010    0.0004
##      4        1.3170             nan     0.0010    0.0005
##      5        1.3159             nan     0.0010    0.0005
##      6        1.3149             nan     0.0010    0.0005
##      7        1.3138             nan     0.0010    0.0005
##      8        1.3129             nan     0.0010    0.0005
##      9        1.3119             nan     0.0010    0.0004
##     10        1.3110             nan     0.0010    0.0004
##     20        1.3014             nan     0.0010    0.0004
##     40        1.2823             nan     0.0010    0.0005
##     60        1.2637             nan     0.0010    0.0005
##     80        1.2459             nan     0.0010    0.0004
##    100        1.2288             nan     0.0010    0.0004
##    120        1.2120             nan     0.0010    0.0004
##    140        1.1959             nan     0.0010    0.0004
##    160        1.1805             nan     0.0010    0.0003
##    180        1.1653             nan     0.0010    0.0003
##    200        1.1507             nan     0.0010    0.0003
##    220        1.1366             nan     0.0010    0.0003
##    240        1.1230             nan     0.0010    0.0003
##    260        1.1099             nan     0.0010    0.0003
##    280        1.0969             nan     0.0010    0.0003
##    300        1.0842             nan     0.0010    0.0003
##    320        1.0720             nan     0.0010    0.0003
##    340        1.0602             nan     0.0010    0.0002
##    360        1.0487             nan     0.0010    0.0002
##    380        1.0375             nan     0.0010    0.0003
##    400        1.0267             nan     0.0010    0.0002
##    420        1.0163             nan     0.0010    0.0002
##    440        1.0060             nan     0.0010    0.0002
##    460        0.9959             nan     0.0010    0.0002
##    480        0.9863             nan     0.0010    0.0002
##    500        0.9766             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0005
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3183             nan     0.0010    0.0004
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0004
##      6        1.3153             nan     0.0010    0.0005
##      7        1.3143             nan     0.0010    0.0004
##      8        1.3133             nan     0.0010    0.0004
##      9        1.3123             nan     0.0010    0.0004
##     10        1.3114             nan     0.0010    0.0004
##     20        1.3018             nan     0.0010    0.0004
##     40        1.2830             nan     0.0010    0.0005
##     60        1.2647             nan     0.0010    0.0004
##     80        1.2471             nan     0.0010    0.0004
##    100        1.2300             nan     0.0010    0.0003
##    120        1.2137             nan     0.0010    0.0004
##    140        1.1976             nan     0.0010    0.0004
##    160        1.1825             nan     0.0010    0.0003
##    180        1.1676             nan     0.0010    0.0003
##    200        1.1533             nan     0.0010    0.0003
##    220        1.1395             nan     0.0010    0.0003
##    240        1.1261             nan     0.0010    0.0003
##    260        1.1129             nan     0.0010    0.0003
##    280        1.1001             nan     0.0010    0.0003
##    300        1.0877             nan     0.0010    0.0003
##    320        1.0758             nan     0.0010    0.0002
##    340        1.0643             nan     0.0010    0.0002
##    360        1.0531             nan     0.0010    0.0002
##    380        1.0422             nan     0.0010    0.0002
##    400        1.0315             nan     0.0010    0.0002
##    420        1.0209             nan     0.0010    0.0002
##    440        1.0106             nan     0.0010    0.0002
##    460        1.0006             nan     0.0010    0.0002
##    480        0.9910             nan     0.0010    0.0002
##    500        0.9816             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3123             nan     0.0100    0.0040
##      2        1.3030             nan     0.0100    0.0048
##      3        1.2939             nan     0.0100    0.0039
##      4        1.2852             nan     0.0100    0.0037
##      5        1.2763             nan     0.0100    0.0038
##      6        1.2681             nan     0.0100    0.0039
##      7        1.2599             nan     0.0100    0.0039
##      8        1.2523             nan     0.0100    0.0032
##      9        1.2440             nan     0.0100    0.0037
##     10        1.2357             nan     0.0100    0.0035
##     20        1.1634             nan     0.0100    0.0029
##     40        1.0487             nan     0.0100    0.0023
##     60        0.9632             nan     0.0100    0.0015
##     80        0.8958             nan     0.0100    0.0012
##    100        0.8410             nan     0.0100    0.0010
##    120        0.7963             nan     0.0100    0.0008
##    140        0.7597             nan     0.0100    0.0005
##    160        0.7265             nan     0.0100    0.0005
##    180        0.7008             nan     0.0100    0.0003
##    200        0.6782             nan     0.0100    0.0002
##    220        0.6575             nan     0.0100    0.0003
##    240        0.6391             nan     0.0100   -0.0001
##    260        0.6229             nan     0.0100    0.0001
##    280        0.6073             nan     0.0100   -0.0001
##    300        0.5937             nan     0.0100    0.0000
##    320        0.5824             nan     0.0100    0.0001
##    340        0.5711             nan     0.0100    0.0000
##    360        0.5597             nan     0.0100    0.0000
##    380        0.5500             nan     0.0100   -0.0001
##    400        0.5398             nan     0.0100   -0.0000
##    420        0.5296             nan     0.0100   -0.0000
##    440        0.5201             nan     0.0100    0.0001
##    460        0.5116             nan     0.0100   -0.0001
##    480        0.5030             nan     0.0100   -0.0002
##    500        0.4955             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3114             nan     0.0100    0.0047
##      2        1.3022             nan     0.0100    0.0038
##      3        1.2943             nan     0.0100    0.0036
##      4        1.2858             nan     0.0100    0.0037
##      5        1.2769             nan     0.0100    0.0043
##      6        1.2690             nan     0.0100    0.0036
##      7        1.2604             nan     0.0100    0.0039
##      8        1.2519             nan     0.0100    0.0039
##      9        1.2438             nan     0.0100    0.0038
##     10        1.2360             nan     0.0100    0.0037
##     20        1.1652             nan     0.0100    0.0032
##     40        1.0492             nan     0.0100    0.0017
##     60        0.9614             nan     0.0100    0.0017
##     80        0.8937             nan     0.0100    0.0011
##    100        0.8402             nan     0.0100    0.0007
##    120        0.7961             nan     0.0100    0.0007
##    140        0.7584             nan     0.0100    0.0004
##    160        0.7285             nan     0.0100    0.0004
##    180        0.7027             nan     0.0100    0.0001
##    200        0.6807             nan     0.0100    0.0002
##    220        0.6608             nan     0.0100    0.0002
##    240        0.6425             nan     0.0100    0.0001
##    260        0.6270             nan     0.0100    0.0001
##    280        0.6116             nan     0.0100    0.0001
##    300        0.5979             nan     0.0100   -0.0000
##    320        0.5845             nan     0.0100    0.0002
##    340        0.5735             nan     0.0100    0.0000
##    360        0.5626             nan     0.0100    0.0000
##    380        0.5523             nan     0.0100    0.0000
##    400        0.5424             nan     0.0100   -0.0000
##    420        0.5332             nan     0.0100    0.0001
##    440        0.5235             nan     0.0100    0.0000
##    460        0.5157             nan     0.0100    0.0001
##    480        0.5072             nan     0.0100   -0.0001
##    500        0.4992             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3120             nan     0.0100    0.0044
##      2        1.3015             nan     0.0100    0.0041
##      3        1.2933             nan     0.0100    0.0035
##      4        1.2839             nan     0.0100    0.0042
##      5        1.2754             nan     0.0100    0.0041
##      6        1.2673             nan     0.0100    0.0033
##      7        1.2596             nan     0.0100    0.0037
##      8        1.2511             nan     0.0100    0.0038
##      9        1.2434             nan     0.0100    0.0033
##     10        1.2357             nan     0.0100    0.0034
##     20        1.1644             nan     0.0100    0.0029
##     40        1.0521             nan     0.0100    0.0016
##     60        0.9660             nan     0.0100    0.0015
##     80        0.8986             nan     0.0100    0.0015
##    100        0.8443             nan     0.0100    0.0009
##    120        0.7998             nan     0.0100    0.0006
##    140        0.7647             nan     0.0100    0.0006
##    160        0.7346             nan     0.0100    0.0003
##    180        0.7098             nan     0.0100    0.0001
##    200        0.6868             nan     0.0100    0.0005
##    220        0.6672             nan     0.0100    0.0002
##    240        0.6488             nan     0.0100    0.0002
##    260        0.6319             nan     0.0100    0.0002
##    280        0.6178             nan     0.0100    0.0001
##    300        0.6045             nan     0.0100    0.0001
##    320        0.5919             nan     0.0100    0.0000
##    340        0.5802             nan     0.0100    0.0002
##    360        0.5693             nan     0.0100    0.0001
##    380        0.5600             nan     0.0100   -0.0001
##    400        0.5509             nan     0.0100   -0.0000
##    420        0.5414             nan     0.0100    0.0001
##    440        0.5323             nan     0.0100   -0.0001
##    460        0.5233             nan     0.0100    0.0000
##    480        0.5161             nan     0.0100    0.0000
##    500        0.5077             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3123             nan     0.0100    0.0040
##      2        1.3021             nan     0.0100    0.0044
##      3        1.2929             nan     0.0100    0.0044
##      4        1.2836             nan     0.0100    0.0042
##      5        1.2743             nan     0.0100    0.0039
##      6        1.2662             nan     0.0100    0.0035
##      7        1.2575             nan     0.0100    0.0041
##      8        1.2482             nan     0.0100    0.0043
##      9        1.2399             nan     0.0100    0.0043
##     10        1.2314             nan     0.0100    0.0039
##     20        1.1558             nan     0.0100    0.0033
##     40        1.0344             nan     0.0100    0.0020
##     60        0.9422             nan     0.0100    0.0017
##     80        0.8705             nan     0.0100    0.0013
##    100        0.8140             nan     0.0100    0.0009
##    120        0.7680             nan     0.0100    0.0005
##    140        0.7288             nan     0.0100    0.0005
##    160        0.6942             nan     0.0100    0.0003
##    180        0.6666             nan     0.0100    0.0002
##    200        0.6426             nan     0.0100    0.0003
##    220        0.6203             nan     0.0100    0.0001
##    240        0.6003             nan     0.0100    0.0001
##    260        0.5831             nan     0.0100    0.0002
##    280        0.5687             nan     0.0100   -0.0000
##    300        0.5544             nan     0.0100    0.0000
##    320        0.5414             nan     0.0100    0.0000
##    340        0.5279             nan     0.0100   -0.0000
##    360        0.5154             nan     0.0100    0.0002
##    380        0.5035             nan     0.0100    0.0001
##    400        0.4925             nan     0.0100   -0.0001
##    420        0.4822             nan     0.0100    0.0001
##    440        0.4717             nan     0.0100   -0.0002
##    460        0.4628             nan     0.0100    0.0000
##    480        0.4550             nan     0.0100   -0.0001
##    500        0.4458             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3116             nan     0.0100    0.0048
##      2        1.3028             nan     0.0100    0.0044
##      3        1.2933             nan     0.0100    0.0046
##      4        1.2838             nan     0.0100    0.0043
##      5        1.2752             nan     0.0100    0.0036
##      6        1.2655             nan     0.0100    0.0044
##      7        1.2567             nan     0.0100    0.0040
##      8        1.2477             nan     0.0100    0.0042
##      9        1.2397             nan     0.0100    0.0035
##     10        1.2316             nan     0.0100    0.0041
##     20        1.1569             nan     0.0100    0.0033
##     40        1.0356             nan     0.0100    0.0023
##     60        0.9436             nan     0.0100    0.0017
##     80        0.8721             nan     0.0100    0.0013
##    100        0.8155             nan     0.0100    0.0009
##    120        0.7694             nan     0.0100    0.0007
##    140        0.7306             nan     0.0100    0.0005
##    160        0.6984             nan     0.0100    0.0004
##    180        0.6699             nan     0.0100    0.0004
##    200        0.6472             nan     0.0100    0.0002
##    220        0.6256             nan     0.0100    0.0002
##    240        0.6068             nan     0.0100   -0.0000
##    260        0.5894             nan     0.0100   -0.0000
##    280        0.5730             nan     0.0100    0.0003
##    300        0.5594             nan     0.0100   -0.0001
##    320        0.5468             nan     0.0100    0.0000
##    340        0.5346             nan     0.0100   -0.0001
##    360        0.5224             nan     0.0100    0.0000
##    380        0.5101             nan     0.0100    0.0000
##    400        0.4992             nan     0.0100   -0.0001
##    420        0.4891             nan     0.0100    0.0000
##    440        0.4794             nan     0.0100   -0.0000
##    460        0.4698             nan     0.0100    0.0001
##    480        0.4609             nan     0.0100    0.0000
##    500        0.4523             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3115             nan     0.0100    0.0042
##      2        1.3016             nan     0.0100    0.0049
##      3        1.2923             nan     0.0100    0.0045
##      4        1.2832             nan     0.0100    0.0042
##      5        1.2745             nan     0.0100    0.0040
##      6        1.2661             nan     0.0100    0.0038
##      7        1.2579             nan     0.0100    0.0038
##      8        1.2494             nan     0.0100    0.0038
##      9        1.2413             nan     0.0100    0.0033
##     10        1.2331             nan     0.0100    0.0037
##     20        1.1592             nan     0.0100    0.0027
##     40        1.0416             nan     0.0100    0.0022
##     60        0.9520             nan     0.0100    0.0018
##     80        0.8812             nan     0.0100    0.0011
##    100        0.8246             nan     0.0100    0.0010
##    120        0.7767             nan     0.0100    0.0009
##    140        0.7376             nan     0.0100    0.0007
##    160        0.7065             nan     0.0100    0.0004
##    180        0.6781             nan     0.0100    0.0003
##    200        0.6528             nan     0.0100    0.0004
##    220        0.6337             nan     0.0100    0.0002
##    240        0.6153             nan     0.0100    0.0001
##    260        0.5994             nan     0.0100    0.0001
##    280        0.5829             nan     0.0100    0.0001
##    300        0.5690             nan     0.0100    0.0001
##    320        0.5556             nan     0.0100   -0.0001
##    340        0.5424             nan     0.0100    0.0000
##    360        0.5312             nan     0.0100   -0.0000
##    380        0.5196             nan     0.0100   -0.0001
##    400        0.5087             nan     0.0100    0.0000
##    420        0.4985             nan     0.0100   -0.0002
##    440        0.4892             nan     0.0100   -0.0001
##    460        0.4801             nan     0.0100   -0.0001
##    480        0.4705             nan     0.0100    0.0000
##    500        0.4617             nan     0.0100   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3111             nan     0.0100    0.0043
##      2        1.3011             nan     0.0100    0.0046
##      3        1.2915             nan     0.0100    0.0043
##      4        1.2820             nan     0.0100    0.0040
##      5        1.2720             nan     0.0100    0.0045
##      6        1.2632             nan     0.0100    0.0041
##      7        1.2545             nan     0.0100    0.0040
##      8        1.2452             nan     0.0100    0.0038
##      9        1.2366             nan     0.0100    0.0041
##     10        1.2279             nan     0.0100    0.0038
##     20        1.1490             nan     0.0100    0.0030
##     40        1.0226             nan     0.0100    0.0024
##     60        0.9282             nan     0.0100    0.0016
##     80        0.8528             nan     0.0100    0.0013
##    100        0.7941             nan     0.0100    0.0007
##    120        0.7452             nan     0.0100    0.0008
##    140        0.7043             nan     0.0100    0.0006
##    160        0.6694             nan     0.0100    0.0006
##    180        0.6405             nan     0.0100    0.0002
##    200        0.6155             nan     0.0100    0.0001
##    220        0.5917             nan     0.0100    0.0003
##    240        0.5719             nan     0.0100    0.0000
##    260        0.5530             nan     0.0100    0.0000
##    280        0.5360             nan     0.0100    0.0000
##    300        0.5202             nan     0.0100   -0.0001
##    320        0.5054             nan     0.0100    0.0001
##    340        0.4907             nan     0.0100   -0.0001
##    360        0.4783             nan     0.0100    0.0000
##    380        0.4660             nan     0.0100   -0.0000
##    400        0.4545             nan     0.0100   -0.0000
##    420        0.4438             nan     0.0100    0.0001
##    440        0.4326             nan     0.0100    0.0001
##    460        0.4221             nan     0.0100   -0.0001
##    480        0.4106             nan     0.0100   -0.0002
##    500        0.4016             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3117             nan     0.0100    0.0041
##      2        1.3017             nan     0.0100    0.0049
##      3        1.2924             nan     0.0100    0.0045
##      4        1.2832             nan     0.0100    0.0041
##      5        1.2741             nan     0.0100    0.0045
##      6        1.2655             nan     0.0100    0.0037
##      7        1.2563             nan     0.0100    0.0044
##      8        1.2469             nan     0.0100    0.0041
##      9        1.2380             nan     0.0100    0.0044
##     10        1.2293             nan     0.0100    0.0036
##     20        1.1520             nan     0.0100    0.0031
##     40        1.0261             nan     0.0100    0.0024
##     60        0.9310             nan     0.0100    0.0018
##     80        0.8580             nan     0.0100    0.0014
##    100        0.7988             nan     0.0100    0.0010
##    120        0.7502             nan     0.0100    0.0006
##    140        0.7092             nan     0.0100    0.0006
##    160        0.6761             nan     0.0100    0.0003
##    180        0.6467             nan     0.0100    0.0001
##    200        0.6212             nan     0.0100    0.0003
##    220        0.5982             nan     0.0100    0.0001
##    240        0.5767             nan     0.0100    0.0003
##    260        0.5586             nan     0.0100    0.0000
##    280        0.5403             nan     0.0100    0.0001
##    300        0.5238             nan     0.0100    0.0001
##    320        0.5094             nan     0.0100    0.0001
##    340        0.4965             nan     0.0100    0.0000
##    360        0.4830             nan     0.0100   -0.0001
##    380        0.4703             nan     0.0100   -0.0001
##    400        0.4593             nan     0.0100   -0.0000
##    420        0.4490             nan     0.0100   -0.0001
##    440        0.4372             nan     0.0100    0.0000
##    460        0.4280             nan     0.0100   -0.0001
##    480        0.4192             nan     0.0100   -0.0000
##    500        0.4096             nan     0.0100   -0.0001
## 
## [Analogous boosting traces for the remaining resamples and tuning combinations at shrinkage 0.0100 omitted.]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2395             nan     0.1000    0.0380
##      2        1.1673             nan     0.1000    0.0351
##      3        1.0980             nan     0.1000    0.0309
##      4        1.0441             nan     0.1000    0.0253
##      5        0.9992             nan     0.1000    0.0217
##      6        0.9546             nan     0.1000    0.0190
##      7        0.9201             nan     0.1000    0.0151
##      8        0.8905             nan     0.1000    0.0106
##      9        0.8656             nan     0.1000    0.0109
##     10        0.8411             nan     0.1000    0.0100
##     20        0.6848             nan     0.1000    0.0001
##     40        0.5461             nan     0.1000    0.0011
##     60        0.4675             nan     0.1000   -0.0009
##     80        0.4098             nan     0.1000   -0.0008
##    100        0.3573             nan     0.1000    0.0000
##    120        0.3145             nan     0.1000   -0.0011
##    140        0.2825             nan     0.1000   -0.0011
##    160        0.2582             nan     0.1000   -0.0009
##    180        0.2287             nan     0.1000   -0.0007
##    200        0.2035             nan     0.1000   -0.0002
##    220        0.1852             nan     0.1000    0.0002
##    240        0.1650             nan     0.1000    0.0001
##    260        0.1474             nan     0.1000   -0.0001
##    280        0.1345             nan     0.1000   -0.0003
##    300        0.1217             nan     0.1000   -0.0004
##    320        0.1091             nan     0.1000   -0.0002
##    340        0.0997             nan     0.1000   -0.0001
##    360        0.0902             nan     0.1000   -0.0000
##    380        0.0831             nan     0.1000   -0.0002
##    400        0.0771             nan     0.1000   -0.0002
##    420        0.0709             nan     0.1000   -0.0001
##    440        0.0649             nan     0.1000   -0.0003
##    460        0.0593             nan     0.1000   -0.0002
##    480        0.0540             nan     0.1000   -0.0002
##    500        0.0493             nan     0.1000   -0.0001
## 
## [Analogous boosting traces for the remaining resamples and tuning combinations at shrinkage 0.1000 omitted.]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3204             nan     0.0010    0.0004
##      2        1.3196             nan     0.0010    0.0003
##      3        1.3188             nan     0.0010    0.0004
##      4        1.3179             nan     0.0010    0.0004
##      5        1.3170             nan     0.0010    0.0004
##      6        1.3162             nan     0.0010    0.0004
##      7        1.3154             nan     0.0010    0.0003
##      8        1.3146             nan     0.0010    0.0004
##      9        1.3138             nan     0.0010    0.0003
##     10        1.3129             nan     0.0010    0.0004
##     20        1.3048             nan     0.0010    0.0003
##     40        1.2887             nan     0.0010    0.0004
##     60        1.2735             nan     0.0010    0.0003
##     80        1.2586             nan     0.0010    0.0003
##    100        1.2443             nan     0.0010    0.0003
##    120        1.2304             nan     0.0010    0.0003
##    140        1.2171             nan     0.0010    0.0003
##    160        1.2040             nan     0.0010    0.0003
##    180        1.1914             nan     0.0010    0.0003
##    200        1.1789             nan     0.0010    0.0003
##    220        1.1669             nan     0.0010    0.0003
##    240        1.1555             nan     0.0010    0.0002
##    260        1.1443             nan     0.0010    0.0002
##    280        1.1335             nan     0.0010    0.0002
##    300        1.1227             nan     0.0010    0.0002
##    320        1.1125             nan     0.0010    0.0002
##    340        1.1026             nan     0.0010    0.0002
##    360        1.0926             nan     0.0010    0.0002
##    380        1.0833             nan     0.0010    0.0002
##    400        1.0739             nan     0.0010    0.0002
##    420        1.0651             nan     0.0010    0.0002
##    440        1.0564             nan     0.0010    0.0002
##    460        1.0477             nan     0.0010    0.0002
##    480        1.0392             nan     0.0010    0.0002
##    500        1.0310             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3195             nan     0.0010    0.0004
##      3        1.3186             nan     0.0010    0.0004
##      4        1.3178             nan     0.0010    0.0004
##      5        1.3169             nan     0.0010    0.0004
##      6        1.3161             nan     0.0010    0.0004
##      7        1.3153             nan     0.0010    0.0003
##      8        1.3145             nan     0.0010    0.0004
##      9        1.3136             nan     0.0010    0.0004
##     10        1.3128             nan     0.0010    0.0004
##     20        1.3048             nan     0.0010    0.0004
##     40        1.2891             nan     0.0010    0.0003
##     60        1.2740             nan     0.0010    0.0003
##     80        1.2593             nan     0.0010    0.0003
##    100        1.2451             nan     0.0010    0.0003
##    120        1.2315             nan     0.0010    0.0003
##    140        1.2181             nan     0.0010    0.0003
##    160        1.2053             nan     0.0010    0.0003
##    180        1.1930             nan     0.0010    0.0003
##    200        1.1808             nan     0.0010    0.0003
##    220        1.1688             nan     0.0010    0.0003
##    240        1.1575             nan     0.0010    0.0002
##    260        1.1460             nan     0.0010    0.0002
##    280        1.1352             nan     0.0010    0.0002
##    300        1.1246             nan     0.0010    0.0002
##    320        1.1140             nan     0.0010    0.0003
##    340        1.1038             nan     0.0010    0.0002
##    360        1.0939             nan     0.0010    0.0002
##    380        1.0843             nan     0.0010    0.0002
##    400        1.0750             nan     0.0010    0.0002
##    420        1.0658             nan     0.0010    0.0002
##    440        1.0570             nan     0.0010    0.0002
##    460        1.0482             nan     0.0010    0.0002
##    480        1.0399             nan     0.0010    0.0002
##    500        1.0318             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3204             nan     0.0010    0.0003
##      2        1.3196             nan     0.0010    0.0004
##      3        1.3188             nan     0.0010    0.0004
##      4        1.3179             nan     0.0010    0.0004
##      5        1.3171             nan     0.0010    0.0004
##      6        1.3163             nan     0.0010    0.0004
##      7        1.3155             nan     0.0010    0.0004
##      8        1.3147             nan     0.0010    0.0004
##      9        1.3139             nan     0.0010    0.0004
##     10        1.3131             nan     0.0010    0.0004
##     20        1.3049             nan     0.0010    0.0004
##     40        1.2893             nan     0.0010    0.0003
##     60        1.2742             nan     0.0010    0.0004
##     80        1.2595             nan     0.0010    0.0003
##    100        1.2454             nan     0.0010    0.0004
##    120        1.2316             nan     0.0010    0.0003
##    140        1.2182             nan     0.0010    0.0003
##    160        1.2055             nan     0.0010    0.0002
##    180        1.1930             nan     0.0010    0.0002
##    200        1.1808             nan     0.0010    0.0002
##    220        1.1690             nan     0.0010    0.0002
##    240        1.1574             nan     0.0010    0.0003
##    260        1.1460             nan     0.0010    0.0003
##    280        1.1348             nan     0.0010    0.0002
##    300        1.1242             nan     0.0010    0.0002
##    320        1.1138             nan     0.0010    0.0002
##    340        1.1037             nan     0.0010    0.0002
##    360        1.0940             nan     0.0010    0.0002
##    380        1.0844             nan     0.0010    0.0002
##    400        1.0750             nan     0.0010    0.0002
##    420        1.0660             nan     0.0010    0.0002
##    440        1.0573             nan     0.0010    0.0002
##    460        1.0488             nan     0.0010    0.0002
##    480        1.0402             nan     0.0010    0.0002
##    500        1.0321             nan     0.0010    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3194             nan     0.0010    0.0004
##      3        1.3185             nan     0.0010    0.0004
##      4        1.3177             nan     0.0010    0.0004
##      5        1.3168             nan     0.0010    0.0004
##      6        1.3159             nan     0.0010    0.0004
##      7        1.3150             nan     0.0010    0.0004
##      8        1.3141             nan     0.0010    0.0004
##      9        1.3132             nan     0.0010    0.0004
##     10        1.3124             nan     0.0010    0.0003
##     20        1.3038             nan     0.0010    0.0004
##     40        1.2869             nan     0.0010    0.0004
##     60        1.2705             nan     0.0010    0.0004
##     80        1.2551             nan     0.0010    0.0004
##    100        1.2398             nan     0.0010    0.0004
##    120        1.2246             nan     0.0010    0.0003
##    140        1.2102             nan     0.0010    0.0003
##    160        1.1964             nan     0.0010    0.0003
##    180        1.1831             nan     0.0010    0.0003
##    200        1.1702             nan     0.0010    0.0003
##    220        1.1573             nan     0.0010    0.0002
##    240        1.1448             nan     0.0010    0.0002
##    260        1.1328             nan     0.0010    0.0002
##    280        1.1210             nan     0.0010    0.0003
##    300        1.1095             nan     0.0010    0.0003
##    320        1.0986             nan     0.0010    0.0002
##    340        1.0879             nan     0.0010    0.0002
##    360        1.0776             nan     0.0010    0.0002
##    380        1.0676             nan     0.0010    0.0002
##    400        1.0578             nan     0.0010    0.0002
##    420        1.0483             nan     0.0010    0.0002
##    440        1.0389             nan     0.0010    0.0002
##    460        1.0298             nan     0.0010    0.0002
##    480        1.0208             nan     0.0010    0.0002
##    500        1.0122             nan     0.0010    0.0002
## 
## (additional verbose gbm training traces omitted for brevity: the remaining
## resampling fits log per-iteration TrainDeviance over 500 boosting
## iterations at each candidate shrinkage value of 0.0010, 0.0100, and 0.1000,
## with ValidDeviance reported as nan because no validation fraction was used)
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2456             nan     0.1000    0.0342
##      2        1.1787             nan     0.1000    0.0306
##      3        1.1188             nan     0.1000    0.0276
##      4        1.0739             nan     0.1000    0.0175
##      5        1.0296             nan     0.1000    0.0192
##      6        0.9899             nan     0.1000    0.0164
##      7        0.9560             nan     0.1000    0.0141
##      8        0.9268             nan     0.1000    0.0111
##      9        0.8987             nan     0.1000    0.0129
##     10        0.8775             nan     0.1000    0.0080
##     20        0.7242             nan     0.1000    0.0008
##     40        0.5901             nan     0.1000    0.0015
##     60        0.5084             nan     0.1000   -0.0002
##     80        0.4478             nan     0.1000   -0.0008
##    100        0.3931             nan     0.1000   -0.0015
##    120        0.3543             nan     0.1000   -0.0001
##    140        0.3128             nan     0.1000   -0.0008
##    160        0.2826             nan     0.1000   -0.0003
##    180        0.2567             nan     0.1000   -0.0009
##    200        0.2328             nan     0.1000    0.0002
##    220        0.2134             nan     0.1000   -0.0001
##    240        0.1934             nan     0.1000   -0.0003
##    260        0.1782             nan     0.1000   -0.0004
##    280        0.1611             nan     0.1000   -0.0003
##    300        0.1487             nan     0.1000   -0.0001
##    320        0.1345             nan     0.1000   -0.0002
##    340        0.1233             nan     0.1000   -0.0005
##    360        0.1132             nan     0.1000   -0.0005
##    380        0.1034             nan     0.1000   -0.0003
##    400        0.0948             nan     0.1000   -0.0002
##    420        0.0865             nan     0.1000   -0.0001
##    440        0.0799             nan     0.1000   -0.0001
##    460        0.0746             nan     0.1000   -0.0002
##    480        0.0691             nan     0.1000   -0.0002
##    500        0.0641             nan     0.1000   -0.0003
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2392             nan     0.1000    0.0338
##      2        1.1772             nan     0.1000    0.0289
##      3        1.1188             nan     0.1000    0.0242
##      4        1.0690             nan     0.1000    0.0201
##      5        1.0274             nan     0.1000    0.0172
##      6        0.9891             nan     0.1000    0.0146
##      7        0.9611             nan     0.1000    0.0108
##      8        0.9276             nan     0.1000    0.0133
##      9        0.8999             nan     0.1000    0.0107
##     10        0.8758             nan     0.1000    0.0086
##     20        0.7266             nan     0.1000    0.0034
##     40        0.5911             nan     0.1000   -0.0003
##     60        0.5180             nan     0.1000   -0.0023
##     80        0.4554             nan     0.1000    0.0001
##    100        0.4065             nan     0.1000   -0.0014
##    120        0.3608             nan     0.1000   -0.0010
##    140        0.3246             nan     0.1000   -0.0003
##    160        0.2956             nan     0.1000   -0.0011
##    180        0.2648             nan     0.1000   -0.0017
##    200        0.2383             nan     0.1000   -0.0011
##    220        0.2173             nan     0.1000   -0.0005
##    240        0.1999             nan     0.1000   -0.0012
##    260        0.1847             nan     0.1000   -0.0003
##    280        0.1688             nan     0.1000    0.0001
##    300        0.1544             nan     0.1000   -0.0009
##    320        0.1421             nan     0.1000   -0.0005
##    340        0.1323             nan     0.1000   -0.0006
##    360        0.1228             nan     0.1000   -0.0001
##    380        0.1149             nan     0.1000   -0.0004
##    400        0.1070             nan     0.1000   -0.0006
##    420        0.0988             nan     0.1000   -0.0005
##    440        0.0914             nan     0.1000   -0.0004
##    460        0.0839             nan     0.1000   -0.0003
##    480        0.0772             nan     0.1000   -0.0001
##    500        0.0715             nan     0.1000   -0.0003
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2344             nan     0.1000    0.0391
##      2        1.1663             nan     0.1000    0.0317
##      3        1.1084             nan     0.1000    0.0233
##      4        1.0592             nan     0.1000    0.0195
##      5        1.0149             nan     0.1000    0.0164
##      6        0.9768             nan     0.1000    0.0161
##      7        0.9421             nan     0.1000    0.0130
##      8        0.9090             nan     0.1000    0.0120
##      9        0.8806             nan     0.1000    0.0087
##     10        0.8567             nan     0.1000    0.0070
##     20        0.6846             nan     0.1000    0.0036
##     40        0.5441             nan     0.1000    0.0007
##     60        0.4508             nan     0.1000    0.0007
##     80        0.3771             nan     0.1000   -0.0012
##    100        0.3274             nan     0.1000    0.0008
##    120        0.2859             nan     0.1000   -0.0002
##    140        0.2516             nan     0.1000   -0.0010
##    160        0.2189             nan     0.1000   -0.0010
##    180        0.1943             nan     0.1000   -0.0005
##    200        0.1727             nan     0.1000   -0.0003
##    220        0.1519             nan     0.1000    0.0000
##    240        0.1345             nan     0.1000   -0.0001
##    260        0.1211             nan     0.1000   -0.0004
##    280        0.1083             nan     0.1000   -0.0002
##    300        0.0967             nan     0.1000   -0.0001
##    320        0.0867             nan     0.1000   -0.0001
##    340        0.0781             nan     0.1000   -0.0005
##    360        0.0709             nan     0.1000   -0.0004
##    380        0.0640             nan     0.1000   -0.0001
##    400        0.0582             nan     0.1000   -0.0003
##    420        0.0528             nan     0.1000   -0.0000
##    440        0.0482             nan     0.1000   -0.0001
##    460        0.0433             nan     0.1000   -0.0000
##    480        0.0392             nan     0.1000   -0.0002
##    500        0.0356             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2365             nan     0.1000    0.0364
##      2        1.1677             nan     0.1000    0.0318
##      3        1.1084             nan     0.1000    0.0275
##      4        1.0541             nan     0.1000    0.0215
##      5        1.0095             nan     0.1000    0.0181
##      6        0.9688             nan     0.1000    0.0145
##      7        0.9325             nan     0.1000    0.0170
##      8        0.8998             nan     0.1000    0.0152
##      9        0.8723             nan     0.1000    0.0089
##     10        0.8443             nan     0.1000    0.0117
##     20        0.6886             nan     0.1000    0.0006
##     40        0.5434             nan     0.1000   -0.0000
##     60        0.4573             nan     0.1000   -0.0001
##     80        0.3887             nan     0.1000   -0.0003
##    100        0.3354             nan     0.1000   -0.0005
##    120        0.2908             nan     0.1000   -0.0007
##    140        0.2574             nan     0.1000   -0.0008
##    160        0.2263             nan     0.1000   -0.0007
##    180        0.1987             nan     0.1000   -0.0010
##    200        0.1758             nan     0.1000   -0.0010
##    220        0.1552             nan     0.1000   -0.0006
##    240        0.1358             nan     0.1000    0.0001
##    260        0.1226             nan     0.1000   -0.0000
##    280        0.1085             nan     0.1000   -0.0004
##    300        0.0978             nan     0.1000   -0.0002
##    320        0.0890             nan     0.1000   -0.0006
##    340        0.0814             nan     0.1000   -0.0001
##    360        0.0731             nan     0.1000   -0.0002
##    380        0.0661             nan     0.1000   -0.0002
##    400        0.0600             nan     0.1000   -0.0003
##    420        0.0548             nan     0.1000   -0.0001
##    440        0.0491             nan     0.1000   -0.0003
##    460        0.0441             nan     0.1000    0.0000
##    480        0.0402             nan     0.1000   -0.0002
##    500        0.0361             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2318             nan     0.1000    0.0407
##      2        1.1613             nan     0.1000    0.0310
##      3        1.1051             nan     0.1000    0.0272
##      4        1.0503             nan     0.1000    0.0237
##      5        1.0038             nan     0.1000    0.0191
##      6        0.9637             nan     0.1000    0.0145
##      7        0.9318             nan     0.1000    0.0140
##      8        0.8990             nan     0.1000    0.0148
##      9        0.8759             nan     0.1000    0.0076
##     10        0.8523             nan     0.1000    0.0094
##     20        0.7075             nan     0.1000    0.0007
##     40        0.5638             nan     0.1000   -0.0001
##     60        0.4784             nan     0.1000    0.0004
##     80        0.4104             nan     0.1000   -0.0011
##    100        0.3529             nan     0.1000   -0.0009
##    120        0.3083             nan     0.1000   -0.0006
##    140        0.2743             nan     0.1000   -0.0012
##    160        0.2436             nan     0.1000   -0.0010
##    180        0.2199             nan     0.1000   -0.0000
##    200        0.1953             nan     0.1000   -0.0011
##    220        0.1744             nan     0.1000   -0.0003
##    240        0.1573             nan     0.1000   -0.0004
##    260        0.1402             nan     0.1000   -0.0008
##    280        0.1257             nan     0.1000   -0.0001
##    300        0.1128             nan     0.1000   -0.0002
##    320        0.1028             nan     0.1000   -0.0004
##    340        0.0928             nan     0.1000   -0.0004
##    360        0.0835             nan     0.1000   -0.0004
##    380        0.0757             nan     0.1000   -0.0004
##    400        0.0688             nan     0.1000   -0.0001
##    420        0.0623             nan     0.1000   -0.0002
##    440        0.0571             nan     0.1000   -0.0002
##    460        0.0525             nan     0.1000   -0.0001
##    480        0.0477             nan     0.1000   -0.0002
##    500        0.0439             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2383             nan     0.1000    0.0387
##      2        1.1579             nan     0.1000    0.0360
##      3        1.0965             nan     0.1000    0.0285
##      4        1.0450             nan     0.1000    0.0221
##      5        0.9962             nan     0.1000    0.0205
##      6        0.9571             nan     0.1000    0.0152
##      7        0.9173             nan     0.1000    0.0166
##      8        0.8795             nan     0.1000    0.0130
##      9        0.8487             nan     0.1000    0.0117
##     10        0.8193             nan     0.1000    0.0100
##     20        0.6546             nan     0.1000    0.0034
##     40        0.4944             nan     0.1000    0.0020
##     60        0.3997             nan     0.1000   -0.0004
##     80        0.3364             nan     0.1000   -0.0002
##    100        0.2802             nan     0.1000   -0.0005
##    120        0.2370             nan     0.1000   -0.0005
##    140        0.2057             nan     0.1000   -0.0011
##    160        0.1753             nan     0.1000   -0.0000
##    180        0.1527             nan     0.1000   -0.0009
##    200        0.1308             nan     0.1000   -0.0002
##    220        0.1142             nan     0.1000   -0.0004
##    240        0.1005             nan     0.1000   -0.0001
##    260        0.0892             nan     0.1000   -0.0001
##    280        0.0786             nan     0.1000   -0.0001
##    300        0.0702             nan     0.1000   -0.0002
##    320        0.0610             nan     0.1000   -0.0000
##    340        0.0542             nan     0.1000   -0.0001
##    360        0.0478             nan     0.1000   -0.0001
##    380        0.0420             nan     0.1000   -0.0001
##    400        0.0372             nan     0.1000   -0.0002
##    420        0.0331             nan     0.1000   -0.0001
##    440        0.0296             nan     0.1000    0.0000
##    460        0.0264             nan     0.1000   -0.0000
##    480        0.0234             nan     0.1000    0.0000
##    500        0.0211             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2360             nan     0.1000    0.0367
##      2        1.1644             nan     0.1000    0.0303
##      3        1.0983             nan     0.1000    0.0302
##      4        1.0389             nan     0.1000    0.0248
##      5        0.9880             nan     0.1000    0.0219
##      6        0.9463             nan     0.1000    0.0163
##      7        0.9142             nan     0.1000    0.0131
##      8        0.8792             nan     0.1000    0.0135
##      9        0.8531             nan     0.1000    0.0093
##     10        0.8281             nan     0.1000    0.0081
##     20        0.6629             nan     0.1000    0.0016
##     40        0.5090             nan     0.1000   -0.0013
##     60        0.4115             nan     0.1000   -0.0011
##     80        0.3411             nan     0.1000   -0.0022
##    100        0.2831             nan     0.1000   -0.0011
##    120        0.2416             nan     0.1000   -0.0016
##    140        0.2082             nan     0.1000   -0.0007
##    160        0.1770             nan     0.1000   -0.0006
##    180        0.1530             nan     0.1000   -0.0008
##    200        0.1337             nan     0.1000   -0.0004
##    220        0.1144             nan     0.1000   -0.0001
##    240        0.1003             nan     0.1000   -0.0004
##    260        0.0876             nan     0.1000    0.0001
##    280        0.0773             nan     0.1000   -0.0003
##    300        0.0686             nan     0.1000   -0.0004
##    320        0.0612             nan     0.1000   -0.0003
##    340        0.0537             nan     0.1000   -0.0001
##    360        0.0475             nan     0.1000   -0.0002
##    380        0.0419             nan     0.1000   -0.0002
##    400        0.0371             nan     0.1000   -0.0002
##    420        0.0327             nan     0.1000   -0.0001
##    440        0.0291             nan     0.1000   -0.0001
##    460        0.0259             nan     0.1000   -0.0001
##    480        0.0231             nan     0.1000   -0.0002
##    500        0.0203             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2416             nan     0.1000    0.0364
##      2        1.1654             nan     0.1000    0.0367
##      3        1.0994             nan     0.1000    0.0257
##      4        1.0472             nan     0.1000    0.0212
##      5        1.0013             nan     0.1000    0.0192
##      6        0.9615             nan     0.1000    0.0153
##      7        0.9210             nan     0.1000    0.0178
##      8        0.8936             nan     0.1000    0.0112
##      9        0.8646             nan     0.1000    0.0127
##     10        0.8405             nan     0.1000    0.0097
##     20        0.6775             nan     0.1000    0.0012
##     40        0.5218             nan     0.1000    0.0004
##     60        0.4265             nan     0.1000   -0.0006
##     80        0.3591             nan     0.1000   -0.0014
##    100        0.3051             nan     0.1000   -0.0010
##    120        0.2610             nan     0.1000   -0.0019
##    140        0.2205             nan     0.1000   -0.0015
##    160        0.1930             nan     0.1000   -0.0007
##    180        0.1687             nan     0.1000   -0.0006
##    200        0.1450             nan     0.1000   -0.0004
##    220        0.1280             nan     0.1000   -0.0002
##    240        0.1122             nan     0.1000   -0.0006
##    260        0.1001             nan     0.1000   -0.0002
##    280        0.0879             nan     0.1000   -0.0004
##    300        0.0778             nan     0.1000   -0.0002
##    320        0.0686             nan     0.1000   -0.0004
##    340        0.0604             nan     0.1000   -0.0004
##    360        0.0542             nan     0.1000   -0.0003
##    380        0.0482             nan     0.1000   -0.0003
##    400        0.0428             nan     0.1000   -0.0002
##    420        0.0384             nan     0.1000   -0.0002
##    440        0.0343             nan     0.1000   -0.0001
##    460        0.0312             nan     0.1000   -0.0001
##    480        0.0276             nan     0.1000   -0.0002
##    500        0.0244             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3179             nan     0.0010    0.0004
##      4        1.3171             nan     0.0010    0.0004
##      5        1.3162             nan     0.0010    0.0004
##      6        1.3153             nan     0.0010    0.0004
##      7        1.3144             nan     0.0010    0.0004
##      8        1.3136             nan     0.0010    0.0004
##      9        1.3127             nan     0.0010    0.0004
##     10        1.3119             nan     0.0010    0.0004
##     20        1.3032             nan     0.0010    0.0004
##     40        1.2867             nan     0.0010    0.0003
##     60        1.2706             nan     0.0010    0.0004
##     80        1.2552             nan     0.0010    0.0003
##    100        1.2405             nan     0.0010    0.0004
##    120        1.2263             nan     0.0010    0.0002
##    140        1.2125             nan     0.0010    0.0003
##    160        1.1990             nan     0.0010    0.0003
##    180        1.1861             nan     0.0010    0.0003
##    200        1.1732             nan     0.0010    0.0002
##    220        1.1611             nan     0.0010    0.0002
##    240        1.1491             nan     0.0010    0.0002
##    260        1.1374             nan     0.0010    0.0003
##    280        1.1261             nan     0.0010    0.0003
##    300        1.1153             nan     0.0010    0.0002
##    320        1.1045             nan     0.0010    0.0002
##    340        1.0940             nan     0.0010    0.0002
##    360        1.0840             nan     0.0010    0.0002
##    380        1.0742             nan     0.0010    0.0002
##    400        1.0646             nan     0.0010    0.0002
##    420        1.0552             nan     0.0010    0.0002
##    440        1.0461             nan     0.0010    0.0002
##    460        1.0374             nan     0.0010    0.0002
##    480        1.0288             nan     0.0010    0.0002
##    500        1.0202             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3179             nan     0.0010    0.0004
##      4        1.3171             nan     0.0010    0.0004
##      5        1.3162             nan     0.0010    0.0005
##      6        1.3154             nan     0.0010    0.0004
##      7        1.3146             nan     0.0010    0.0003
##      8        1.3137             nan     0.0010    0.0004
##      9        1.3129             nan     0.0010    0.0004
##     10        1.3120             nan     0.0010    0.0004
##     20        1.3034             nan     0.0010    0.0003
##     40        1.2867             nan     0.0010    0.0004
##     60        1.2707             nan     0.0010    0.0003
##     80        1.2552             nan     0.0010    0.0004
##    100        1.2403             nan     0.0010    0.0003
##    120        1.2258             nan     0.0010    0.0003
##    140        1.2115             nan     0.0010    0.0003
##    160        1.1981             nan     0.0010    0.0003
##    180        1.1848             nan     0.0010    0.0003
##    200        1.1720             nan     0.0010    0.0003
##    220        1.1597             nan     0.0010    0.0003
##    240        1.1480             nan     0.0010    0.0002
##    260        1.1363             nan     0.0010    0.0003
##    280        1.1251             nan     0.0010    0.0002
##    300        1.1142             nan     0.0010    0.0002
##    320        1.1038             nan     0.0010    0.0002
##    340        1.0935             nan     0.0010    0.0002
##    360        1.0837             nan     0.0010    0.0002
##    380        1.0739             nan     0.0010    0.0002
##    400        1.0645             nan     0.0010    0.0002
##    420        1.0553             nan     0.0010    0.0002
##    440        1.0463             nan     0.0010    0.0002
##    460        1.0376             nan     0.0010    0.0002
##    480        1.0290             nan     0.0010    0.0002
##    500        1.0208             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3179             nan     0.0010    0.0004
##      4        1.3170             nan     0.0010    0.0004
##      5        1.3161             nan     0.0010    0.0004
##      6        1.3151             nan     0.0010    0.0004
##      7        1.3144             nan     0.0010    0.0003
##      8        1.3135             nan     0.0010    0.0004
##      9        1.3127             nan     0.0010    0.0004
##     10        1.3119             nan     0.0010    0.0004
##     20        1.3035             nan     0.0010    0.0003
##     40        1.2870             nan     0.0010    0.0004
##     60        1.2712             nan     0.0010    0.0004
##     80        1.2559             nan     0.0010    0.0003
##    100        1.2408             nan     0.0010    0.0003
##    120        1.2264             nan     0.0010    0.0003
##    140        1.2126             nan     0.0010    0.0003
##    160        1.1992             nan     0.0010    0.0002
##    180        1.1862             nan     0.0010    0.0003
##    200        1.1733             nan     0.0010    0.0003
##    220        1.1612             nan     0.0010    0.0003
##    240        1.1494             nan     0.0010    0.0003
##    260        1.1381             nan     0.0010    0.0003
##    280        1.1274             nan     0.0010    0.0003
##    300        1.1165             nan     0.0010    0.0003
##    320        1.1060             nan     0.0010    0.0002
##    340        1.0960             nan     0.0010    0.0002
##    360        1.0859             nan     0.0010    0.0002
##    380        1.0763             nan     0.0010    0.0002
##    400        1.0671             nan     0.0010    0.0002
##    420        1.0581             nan     0.0010    0.0002
##    440        1.0495             nan     0.0010    0.0002
##    460        1.0407             nan     0.0010    0.0002
##    480        1.0323             nan     0.0010    0.0002
##    500        1.0243             nan     0.0010    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3199             nan     0.0010    0.0004
##      2        1.3189             nan     0.0010    0.0004
##      3        1.3180             nan     0.0010    0.0005
##      4        1.3171             nan     0.0010    0.0004
##      5        1.3161             nan     0.0010    0.0004
##      6        1.3152             nan     0.0010    0.0004
##      7        1.3141             nan     0.0010    0.0004
##      8        1.3133             nan     0.0010    0.0004
##      9        1.3125             nan     0.0010    0.0004
##     10        1.3115             nan     0.0010    0.0004
##     20        1.3024             nan     0.0010    0.0004
##     40        1.2852             nan     0.0010    0.0004
##     60        1.2683             nan     0.0010    0.0004
##     80        1.2518             nan     0.0010    0.0003
##    100        1.2359             nan     0.0010    0.0003
##    120        1.2205             nan     0.0010    0.0003
##    140        1.2057             nan     0.0010    0.0003
##    160        1.1912             nan     0.0010    0.0003
##    180        1.1773             nan     0.0010    0.0003
##    200        1.1642             nan     0.0010    0.0003
##    220        1.1514             nan     0.0010    0.0003
##    240        1.1386             nan     0.0010    0.0003
##    260        1.1265             nan     0.0010    0.0002
##    280        1.1149             nan     0.0010    0.0003
##    300        1.1034             nan     0.0010    0.0003
##    320        1.0921             nan     0.0010    0.0003
##    340        1.0811             nan     0.0010    0.0002
##    360        1.0706             nan     0.0010    0.0002
##    380        1.0602             nan     0.0010    0.0003
##    400        1.0503             nan     0.0010    0.0002
##    420        1.0405             nan     0.0010    0.0002
##    440        1.0308             nan     0.0010    0.0002
##    460        1.0213             nan     0.0010    0.0002
##    480        1.0125             nan     0.0010    0.0002
##    500        1.0037             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0005
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3178             nan     0.0010    0.0004
##      4        1.3169             nan     0.0010    0.0004
##      5        1.3159             nan     0.0010    0.0004
##      6        1.3150             nan     0.0010    0.0004
##      7        1.3141             nan     0.0010    0.0004
##      8        1.3131             nan     0.0010    0.0004
##      9        1.3122             nan     0.0010    0.0004
##     10        1.3113             nan     0.0010    0.0004
##     20        1.3024             nan     0.0010    0.0004
##     40        1.2848             nan     0.0010    0.0004
##     60        1.2682             nan     0.0010    0.0004
##     80        1.2518             nan     0.0010    0.0004
##    100        1.2364             nan     0.0010    0.0003
##    120        1.2213             nan     0.0010    0.0003
##    140        1.2066             nan     0.0010    0.0003
##    160        1.1926             nan     0.0010    0.0003
##    180        1.1787             nan     0.0010    0.0003
##    200        1.1655             nan     0.0010    0.0003
##    220        1.1526             nan     0.0010    0.0003
##    240        1.1399             nan     0.0010    0.0003
##    260        1.1279             nan     0.0010    0.0003
##    280        1.1160             nan     0.0010    0.0002
##    300        1.1047             nan     0.0010    0.0002
##    320        1.0937             nan     0.0010    0.0002
##    340        1.0829             nan     0.0010    0.0002
##    360        1.0726             nan     0.0010    0.0002
##    380        1.0625             nan     0.0010    0.0002
##    400        1.0526             nan     0.0010    0.0002
##    420        1.0429             nan     0.0010    0.0002
##    440        1.0333             nan     0.0010    0.0002
##    460        1.0242             nan     0.0010    0.0002
##    480        1.0153             nan     0.0010    0.0002
##    500        1.0067             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3189             nan     0.0010    0.0004
##      3        1.3180             nan     0.0010    0.0004
##      4        1.3172             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0004
##      6        1.3154             nan     0.0010    0.0004
##      7        1.3145             nan     0.0010    0.0004
##      8        1.3136             nan     0.0010    0.0004
##      9        1.3127             nan     0.0010    0.0004
##     10        1.3118             nan     0.0010    0.0004
##     20        1.3030             nan     0.0010    0.0005
##     40        1.2860             nan     0.0010    0.0004
##     60        1.2692             nan     0.0010    0.0004
##     80        1.2530             nan     0.0010    0.0004
##    100        1.2377             nan     0.0010    0.0003
##    120        1.2225             nan     0.0010    0.0004
##    140        1.2082             nan     0.0010    0.0003
##    160        1.1945             nan     0.0010    0.0003
##    180        1.1809             nan     0.0010    0.0003
##    200        1.1678             nan     0.0010    0.0003
##    220        1.1549             nan     0.0010    0.0002
##    240        1.1428             nan     0.0010    0.0002
##    260        1.1306             nan     0.0010    0.0003
##    280        1.1188             nan     0.0010    0.0003
##    300        1.1075             nan     0.0010    0.0003
##    320        1.0964             nan     0.0010    0.0002
##    340        1.0858             nan     0.0010    0.0002
##    360        1.0754             nan     0.0010    0.0002
##    380        1.0652             nan     0.0010    0.0002
##    400        1.0554             nan     0.0010    0.0002
##    420        1.0456             nan     0.0010    0.0002
##    440        1.0365             nan     0.0010    0.0002
##    460        1.0275             nan     0.0010    0.0002
##    480        1.0185             nan     0.0010    0.0002
##    500        1.0097             nan     0.0010    0.0002
## 
## [verbose gbm training output truncated: repeated per-resample traces of Iter,
## TrainDeviance, ValidDeviance, StepSize and Improve for shrinkage values 0.0010,
## 0.0100 and 0.1000, each run to 500 boosting iterations; ValidDeviance is nan
## throughout because no validation fraction was held out]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2352             nan     0.1000    0.0361
##      2        1.1587             nan     0.1000    0.0378
##      3        1.0954             nan     0.1000    0.0289
##      4        1.0418             nan     0.1000    0.0207
##      5        0.9956             nan     0.1000    0.0201
##      6        0.9566             nan     0.1000    0.0149
##      7        0.9245             nan     0.1000    0.0118
##      8        0.8899             nan     0.1000    0.0154
##      9        0.8620             nan     0.1000    0.0114
##     10        0.8359             nan     0.1000    0.0084
##     20        0.6763             nan     0.1000    0.0025
##     40        0.5433             nan     0.1000    0.0005
##     60        0.4520             nan     0.1000   -0.0005
##     80        0.3856             nan     0.1000   -0.0003
##    100        0.3382             nan     0.1000   -0.0005
##    120        0.2954             nan     0.1000   -0.0009
##    140        0.2597             nan     0.1000   -0.0003
##    160        0.2292             nan     0.1000   -0.0008
##    180        0.2055             nan     0.1000    0.0001
##    200        0.1797             nan     0.1000    0.0001
##    220        0.1583             nan     0.1000   -0.0003
##    240        0.1396             nan     0.1000   -0.0006
##    260        0.1234             nan     0.1000   -0.0002
##    280        0.1104             nan     0.1000   -0.0003
##    300        0.0993             nan     0.1000   -0.0002
##    320        0.0895             nan     0.1000   -0.0003
##    340        0.0807             nan     0.1000   -0.0001
##    360        0.0728             nan     0.1000   -0.0001
##    380        0.0661             nan     0.1000   -0.0002
##    400        0.0597             nan     0.1000   -0.0001
##    420        0.0540             nan     0.1000   -0.0003
##    440        0.0487             nan     0.1000   -0.0001
##    460        0.0447             nan     0.1000   -0.0000
##    480        0.0409             nan     0.1000   -0.0001
##    500        0.0367             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2348             nan     0.1000    0.0403
##      2        1.1645             nan     0.1000    0.0335
##      3        1.1017             nan     0.1000    0.0279
##      4        1.0505             nan     0.1000    0.0205
##      5        1.0067             nan     0.1000    0.0219
##      6        0.9663             nan     0.1000    0.0168
##      7        0.9313             nan     0.1000    0.0142
##      8        0.9046             nan     0.1000    0.0091
##      9        0.8769             nan     0.1000    0.0106
##     10        0.8496             nan     0.1000    0.0113
##     20        0.6862             nan     0.1000    0.0002
##     40        0.5537             nan     0.1000   -0.0035
##     60        0.4607             nan     0.1000   -0.0002
##     80        0.3971             nan     0.1000   -0.0018
##    100        0.3405             nan     0.1000   -0.0013
##    120        0.2992             nan     0.1000   -0.0011
##    140        0.2610             nan     0.1000   -0.0015
##    160        0.2296             nan     0.1000   -0.0008
##    180        0.2001             nan     0.1000   -0.0002
##    200        0.1786             nan     0.1000   -0.0006
##    220        0.1568             nan     0.1000   -0.0003
##    240        0.1377             nan     0.1000   -0.0003
##    260        0.1237             nan     0.1000   -0.0003
##    280        0.1097             nan     0.1000   -0.0003
##    300        0.0991             nan     0.1000   -0.0003
##    320        0.0905             nan     0.1000   -0.0001
##    340        0.0807             nan     0.1000   -0.0001
##    360        0.0732             nan     0.1000   -0.0003
##    380        0.0659             nan     0.1000   -0.0002
##    400        0.0596             nan     0.1000   -0.0005
##    420        0.0544             nan     0.1000   -0.0000
##    440        0.0497             nan     0.1000   -0.0002
##    460        0.0450             nan     0.1000   -0.0001
##    480        0.0409             nan     0.1000   -0.0001
##    500        0.0372             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2334             nan     0.1000    0.0391
##      2        1.1585             nan     0.1000    0.0318
##      3        1.0962             nan     0.1000    0.0278
##      4        1.0445             nan     0.1000    0.0229
##      5        0.9953             nan     0.1000    0.0214
##      6        0.9532             nan     0.1000    0.0164
##      7        0.9210             nan     0.1000    0.0136
##      8        0.8922             nan     0.1000    0.0122
##      9        0.8656             nan     0.1000    0.0096
##     10        0.8408             nan     0.1000    0.0088
##     20        0.6867             nan     0.1000    0.0020
##     40        0.5505             nan     0.1000    0.0005
##     60        0.4689             nan     0.1000   -0.0001
##     80        0.4071             nan     0.1000   -0.0012
##    100        0.3501             nan     0.1000   -0.0006
##    120        0.3076             nan     0.1000   -0.0014
##    140        0.2699             nan     0.1000   -0.0009
##    160        0.2396             nan     0.1000   -0.0009
##    180        0.2125             nan     0.1000   -0.0002
##    200        0.1897             nan     0.1000   -0.0006
##    220        0.1689             nan     0.1000   -0.0005
##    240        0.1506             nan     0.1000   -0.0009
##    260        0.1347             nan     0.1000   -0.0005
##    280        0.1204             nan     0.1000   -0.0002
##    300        0.1082             nan     0.1000   -0.0005
##    320        0.0982             nan     0.1000   -0.0006
##    340        0.0886             nan     0.1000   -0.0003
##    360        0.0808             nan     0.1000   -0.0003
##    380        0.0731             nan     0.1000   -0.0002
##    400        0.0671             nan     0.1000   -0.0002
##    420        0.0615             nan     0.1000   -0.0003
##    440        0.0556             nan     0.1000   -0.0001
##    460        0.0506             nan     0.1000   -0.0001
##    480        0.0458             nan     0.1000   -0.0001
##    500        0.0416             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2278             nan     0.1000    0.0408
##      2        1.1612             nan     0.1000    0.0271
##      3        1.0993             nan     0.1000    0.0240
##      4        1.0385             nan     0.1000    0.0243
##      5        0.9870             nan     0.1000    0.0236
##      6        0.9473             nan     0.1000    0.0167
##      7        0.9082             nan     0.1000    0.0162
##      8        0.8742             nan     0.1000    0.0126
##      9        0.8434             nan     0.1000    0.0110
##     10        0.8180             nan     0.1000    0.0093
##     20        0.6514             nan     0.1000    0.0033
##     40        0.4848             nan     0.1000    0.0008
##     60        0.3953             nan     0.1000   -0.0009
##     80        0.3238             nan     0.1000   -0.0011
##    100        0.2703             nan     0.1000   -0.0009
##    120        0.2294             nan     0.1000   -0.0005
##    140        0.1966             nan     0.1000   -0.0005
##    160        0.1667             nan     0.1000   -0.0005
##    180        0.1441             nan     0.1000   -0.0001
##    200        0.1249             nan     0.1000   -0.0006
##    220        0.1108             nan     0.1000   -0.0004
##    240        0.0950             nan     0.1000   -0.0003
##    260        0.0824             nan     0.1000   -0.0003
##    280        0.0730             nan     0.1000   -0.0000
##    300        0.0657             nan     0.1000   -0.0001
##    320        0.0586             nan     0.1000   -0.0002
##    340        0.0518             nan     0.1000    0.0001
##    360        0.0466             nan     0.1000   -0.0001
##    380        0.0413             nan     0.1000   -0.0002
##    400        0.0365             nan     0.1000   -0.0001
##    420        0.0322             nan     0.1000   -0.0001
##    440        0.0290             nan     0.1000   -0.0001
##    460        0.0260             nan     0.1000   -0.0002
##    480        0.0230             nan     0.1000   -0.0000
##    500        0.0205             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2256             nan     0.1000    0.0369
##      2        1.1489             nan     0.1000    0.0330
##      3        1.0807             nan     0.1000    0.0310
##      4        1.0243             nan     0.1000    0.0238
##      5        0.9795             nan     0.1000    0.0180
##      6        0.9342             nan     0.1000    0.0173
##      7        0.8992             nan     0.1000    0.0145
##      8        0.8627             nan     0.1000    0.0149
##      9        0.8343             nan     0.1000    0.0118
##     10        0.8089             nan     0.1000    0.0076
##     20        0.6499             nan     0.1000    0.0012
##     40        0.5033             nan     0.1000   -0.0004
##     60        0.4034             nan     0.1000   -0.0024
##     80        0.3358             nan     0.1000    0.0006
##    100        0.2801             nan     0.1000   -0.0015
##    120        0.2370             nan     0.1000   -0.0011
##    140        0.2026             nan     0.1000   -0.0010
##    160        0.1730             nan     0.1000   -0.0002
##    180        0.1488             nan     0.1000   -0.0008
##    200        0.1306             nan     0.1000   -0.0003
##    220        0.1152             nan     0.1000   -0.0003
##    240        0.1017             nan     0.1000    0.0000
##    260        0.0878             nan     0.1000   -0.0005
##    280        0.0773             nan     0.1000   -0.0002
##    300        0.0679             nan     0.1000   -0.0001
##    320        0.0607             nan     0.1000   -0.0001
##    340        0.0542             nan     0.1000   -0.0003
##    360        0.0482             nan     0.1000   -0.0002
##    380        0.0426             nan     0.1000   -0.0001
##    400        0.0375             nan     0.1000   -0.0000
##    420        0.0334             nan     0.1000   -0.0002
##    440        0.0298             nan     0.1000   -0.0002
##    460        0.0266             nan     0.1000   -0.0001
##    480        0.0234             nan     0.1000   -0.0001
##    500        0.0209             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2307             nan     0.1000    0.0413
##      2        1.1592             nan     0.1000    0.0298
##      3        1.1007             nan     0.1000    0.0263
##      4        1.0469             nan     0.1000    0.0231
##      5        0.9975             nan     0.1000    0.0208
##      6        0.9555             nan     0.1000    0.0167
##      7        0.9192             nan     0.1000    0.0128
##      8        0.8909             nan     0.1000    0.0096
##      9        0.8631             nan     0.1000    0.0102
##     10        0.8363             nan     0.1000    0.0095
##     20        0.6783             nan     0.1000    0.0007
##     40        0.5227             nan     0.1000    0.0004
##     60        0.4303             nan     0.1000   -0.0012
##     80        0.3577             nan     0.1000   -0.0006
##    100        0.3042             nan     0.1000   -0.0005
##    120        0.2632             nan     0.1000   -0.0007
##    140        0.2217             nan     0.1000   -0.0010
##    160        0.1936             nan     0.1000   -0.0006
##    180        0.1672             nan     0.1000   -0.0005
##    200        0.1459             nan     0.1000   -0.0006
##    220        0.1281             nan     0.1000   -0.0004
##    240        0.1137             nan     0.1000   -0.0004
##    260        0.1015             nan     0.1000   -0.0004
##    280        0.0902             nan     0.1000   -0.0002
##    300        0.0791             nan     0.1000   -0.0004
##    320        0.0696             nan     0.1000   -0.0003
##    340        0.0616             nan     0.1000   -0.0002
##    360        0.0554             nan     0.1000   -0.0003
##    380        0.0497             nan     0.1000   -0.0001
##    400        0.0447             nan     0.1000   -0.0002
##    420        0.0399             nan     0.1000   -0.0002
##    440        0.0360             nan     0.1000   -0.0001
##    460        0.0319             nan     0.1000   -0.0001
##    480        0.0285             nan     0.1000   -0.0002
##    500        0.0253             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3189             nan     0.0010    0.0004
##      3        1.3180             nan     0.0010    0.0004
##      4        1.3172             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0004
##      6        1.3154             nan     0.0010    0.0004
##      7        1.3146             nan     0.0010    0.0004
##      8        1.3137             nan     0.0010    0.0004
##      9        1.3129             nan     0.0010    0.0004
##     10        1.3120             nan     0.0010    0.0004
##     20        1.3037             nan     0.0010    0.0003
##     40        1.2878             nan     0.0010    0.0003
##     60        1.2721             nan     0.0010    0.0004
##     80        1.2571             nan     0.0010    0.0003
##    100        1.2425             nan     0.0010    0.0004
##    120        1.2284             nan     0.0010    0.0003
##    140        1.2147             nan     0.0010    0.0003
##    160        1.2013             nan     0.0010    0.0003
##    180        1.1886             nan     0.0010    0.0003
##    200        1.1762             nan     0.0010    0.0003
##    220        1.1639             nan     0.0010    0.0003
##    240        1.1521             nan     0.0010    0.0002
##    260        1.1407             nan     0.0010    0.0003
##    280        1.1297             nan     0.0010    0.0002
##    300        1.1189             nan     0.0010    0.0002
##    320        1.1087             nan     0.0010    0.0002
##    340        1.0986             nan     0.0010    0.0002
##    360        1.0889             nan     0.0010    0.0002
##    380        1.0794             nan     0.0010    0.0002
##    400        1.0700             nan     0.0010    0.0002
##    420        1.0610             nan     0.0010    0.0002
##    440        1.0521             nan     0.0010    0.0002
##    460        1.0434             nan     0.0010    0.0002
##    480        1.0348             nan     0.0010    0.0002
##    500        1.0266             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3190             nan     0.0010    0.0003
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3165             nan     0.0010    0.0004
##      6        1.3156             nan     0.0010    0.0004
##      7        1.3147             nan     0.0010    0.0004
##      8        1.3139             nan     0.0010    0.0004
##      9        1.3130             nan     0.0010    0.0004
##     10        1.3121             nan     0.0010    0.0004
##     20        1.3038             nan     0.0010    0.0004
##     40        1.2877             nan     0.0010    0.0003
##     60        1.2722             nan     0.0010    0.0003
##     80        1.2572             nan     0.0010    0.0003
##    100        1.2429             nan     0.0010    0.0003
##    120        1.2291             nan     0.0010    0.0003
##    140        1.2155             nan     0.0010    0.0003
##    160        1.2025             nan     0.0010    0.0003
##    180        1.1896             nan     0.0010    0.0003
##    200        1.1772             nan     0.0010    0.0003
##    220        1.1652             nan     0.0010    0.0002
##    240        1.1535             nan     0.0010    0.0002
##    260        1.1423             nan     0.0010    0.0002
##    280        1.1313             nan     0.0010    0.0002
##    300        1.1207             nan     0.0010    0.0002
##    320        1.1103             nan     0.0010    0.0002
##    340        1.1003             nan     0.0010    0.0002
##    360        1.0903             nan     0.0010    0.0002
##    380        1.0809             nan     0.0010    0.0002
##    400        1.0715             nan     0.0010    0.0002
##    420        1.0625             nan     0.0010    0.0002
##    440        1.0537             nan     0.0010    0.0002
##    460        1.0448             nan     0.0010    0.0002
##    480        1.0366             nan     0.0010    0.0002
##    500        1.0286             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3190             nan     0.0010    0.0004
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3165             nan     0.0010    0.0003
##      6        1.3157             nan     0.0010    0.0004
##      7        1.3149             nan     0.0010    0.0004
##      8        1.3140             nan     0.0010    0.0003
##      9        1.3132             nan     0.0010    0.0004
##     10        1.3124             nan     0.0010    0.0004
##     20        1.3042             nan     0.0010    0.0004
##     40        1.2882             nan     0.0010    0.0004
##     60        1.2727             nan     0.0010    0.0003
##     80        1.2577             nan     0.0010    0.0003
##    100        1.2430             nan     0.0010    0.0003
##    120        1.2291             nan     0.0010    0.0003
##    140        1.2154             nan     0.0010    0.0003
##    160        1.2022             nan     0.0010    0.0003
##    180        1.1894             nan     0.0010    0.0003
##    200        1.1771             nan     0.0010    0.0003
##    220        1.1652             nan     0.0010    0.0003
##    240        1.1537             nan     0.0010    0.0003
##    260        1.1425             nan     0.0010    0.0002
##    280        1.1316             nan     0.0010    0.0002
##    300        1.1210             nan     0.0010    0.0003
##    320        1.1107             nan     0.0010    0.0002
##    340        1.1009             nan     0.0010    0.0002
##    360        1.0911             nan     0.0010    0.0002
##    380        1.0815             nan     0.0010    0.0002
##    400        1.0723             nan     0.0010    0.0002
##    420        1.0631             nan     0.0010    0.0002
##    440        1.0542             nan     0.0010    0.0002
##    460        1.0456             nan     0.0010    0.0002
##    480        1.0373             nan     0.0010    0.0002
##    500        1.0291             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3179             nan     0.0010    0.0004
##      4        1.3169             nan     0.0010    0.0005
##      5        1.3159             nan     0.0010    0.0005
##      6        1.3150             nan     0.0010    0.0004
##      7        1.3141             nan     0.0010    0.0003
##      8        1.3132             nan     0.0010    0.0004
##      9        1.3123             nan     0.0010    0.0004
##     10        1.3114             nan     0.0010    0.0004
##     20        1.3024             nan     0.0010    0.0004
##     40        1.2855             nan     0.0010    0.0003
##     60        1.2691             nan     0.0010    0.0004
##     80        1.2530             nan     0.0010    0.0003
##    100        1.2377             nan     0.0010    0.0003
##    120        1.2228             nan     0.0010    0.0003
##    140        1.2083             nan     0.0010    0.0003
##    160        1.1941             nan     0.0010    0.0003
##    180        1.1802             nan     0.0010    0.0003
##    200        1.1668             nan     0.0010    0.0003
##    220        1.1540             nan     0.0010    0.0003
##    240        1.1417             nan     0.0010    0.0002
##    260        1.1296             nan     0.0010    0.0002
##    280        1.1179             nan     0.0010    0.0003
##    300        1.1064             nan     0.0010    0.0003
##    320        1.0953             nan     0.0010    0.0003
##    340        1.0844             nan     0.0010    0.0002
##    360        1.0743             nan     0.0010    0.0002
##    380        1.0641             nan     0.0010    0.0002
##    400        1.0541             nan     0.0010    0.0002
##    420        1.0445             nan     0.0010    0.0002
##    440        1.0350             nan     0.0010    0.0002
##    460        1.0261             nan     0.0010    0.0002
##    480        1.0172             nan     0.0010    0.0002
##    500        1.0084             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3179             nan     0.0010    0.0004
##      4        1.3171             nan     0.0010    0.0004
##      5        1.3161             nan     0.0010    0.0004
##      6        1.3152             nan     0.0010    0.0004
##      7        1.3144             nan     0.0010    0.0004
##      8        1.3136             nan     0.0010    0.0004
##      9        1.3128             nan     0.0010    0.0004
##     10        1.3119             nan     0.0010    0.0004
##     20        1.3029             nan     0.0010    0.0004
##     40        1.2857             nan     0.0010    0.0004
##     60        1.2691             nan     0.0010    0.0004
##     80        1.2534             nan     0.0010    0.0004
##    100        1.2378             nan     0.0010    0.0004
##    120        1.2228             nan     0.0010    0.0003
##    140        1.2087             nan     0.0010    0.0004
##    160        1.1945             nan     0.0010    0.0003
##    180        1.1808             nan     0.0010    0.0003
##    200        1.1676             nan     0.0010    0.0003
##    220        1.1547             nan     0.0010    0.0003
##    240        1.1424             nan     0.0010    0.0002
##    260        1.1304             nan     0.0010    0.0003
##    280        1.1184             nan     0.0010    0.0002
##    300        1.1072             nan     0.0010    0.0002
##    320        1.0961             nan     0.0010    0.0003
##    340        1.0852             nan     0.0010    0.0002
##    360        1.0751             nan     0.0010    0.0002
##    380        1.0650             nan     0.0010    0.0002
##    400        1.0551             nan     0.0010    0.0002
##    420        1.0455             nan     0.0010    0.0002
##    440        1.0361             nan     0.0010    0.0002
##    460        1.0268             nan     0.0010    0.0002
##    480        1.0180             nan     0.0010    0.0002
##    500        1.0094             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3179             nan     0.0010    0.0005
##      4        1.3169             nan     0.0010    0.0004
##      5        1.3161             nan     0.0010    0.0004
##      6        1.3152             nan     0.0010    0.0005
##      7        1.3143             nan     0.0010    0.0004
##      8        1.3133             nan     0.0010    0.0004
##      9        1.3123             nan     0.0010    0.0004
##     10        1.3115             nan     0.0010    0.0003
##     20        1.3027             nan     0.0010    0.0004
##     40        1.2854             nan     0.0010    0.0003
##     60        1.2691             nan     0.0010    0.0004
##     80        1.2530             nan     0.0010    0.0004
##    100        1.2374             nan     0.0010    0.0003
##    120        1.2226             nan     0.0010    0.0003
##    140        1.2080             nan     0.0010    0.0003
##    160        1.1938             nan     0.0010    0.0003
##    180        1.1806             nan     0.0010    0.0003
##    200        1.1673             nan     0.0010    0.0003
##    220        1.1546             nan     0.0010    0.0003
##    240        1.1422             nan     0.0010    0.0003
##    260        1.1300             nan     0.0010    0.0003
##    280        1.1184             nan     0.0010    0.0002
##    300        1.1071             nan     0.0010    0.0003
##    320        1.0960             nan     0.0010    0.0002
##    340        1.0856             nan     0.0010    0.0002
##    360        1.0753             nan     0.0010    0.0002
##    380        1.0653             nan     0.0010    0.0002
##    400        1.0557             nan     0.0010    0.0002
##    420        1.0463             nan     0.0010    0.0002
##    440        1.0370             nan     0.0010    0.0002
##    460        1.0277             nan     0.0010    0.0002
##    480        1.0189             nan     0.0010    0.0002
##    500        1.0103             nan     0.0010    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3196             nan     0.0010    0.0005
##      2        1.3186             nan     0.0010    0.0005
##      3        1.3176             nan     0.0010    0.0004
##      4        1.3167             nan     0.0010    0.0004
##      5        1.3157             nan     0.0010    0.0004
##      6        1.3149             nan     0.0010    0.0004
##      7        1.3139             nan     0.0010    0.0005
##      8        1.3129             nan     0.0010    0.0004
##      9        1.3120             nan     0.0010    0.0005
##     10        1.3111             nan     0.0010    0.0004
##     20        1.3019             nan     0.0010    0.0004
##     40        1.2836             nan     0.0010    0.0004
##     60        1.2659             nan     0.0010    0.0004
##     80        1.2488             nan     0.0010    0.0004
##    100        1.2323             nan     0.0010    0.0003
##    120        1.2163             nan     0.0010    0.0003
##    140        1.2008             nan     0.0010    0.0003
##    160        1.1860             nan     0.0010    0.0003
##    180        1.1722             nan     0.0010    0.0003
##    200        1.1585             nan     0.0010    0.0003
##    220        1.1451             nan     0.0010    0.0003
##    240        1.1320             nan     0.0010    0.0003
##    260        1.1193             nan     0.0010    0.0003
##    280        1.1072             nan     0.0010    0.0003
##    300        1.0952             nan     0.0010    0.0002
##    320        1.0836             nan     0.0010    0.0003
##    340        1.0726             nan     0.0010    0.0003
##    360        1.0616             nan     0.0010    0.0002
##    380        1.0509             nan     0.0010    0.0002
##    400        1.0405             nan     0.0010    0.0002
##    420        1.0302             nan     0.0010    0.0002
##    440        1.0203             nan     0.0010    0.0002
##    460        1.0106             nan     0.0010    0.0002
##    480        1.0013             nan     0.0010    0.0002
##    500        0.9924             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0005
##      2        1.3188             nan     0.0010    0.0005
##      3        1.3178             nan     0.0010    0.0005
##      4        1.3169             nan     0.0010    0.0004
##      5        1.3160             nan     0.0010    0.0004
##      6        1.3150             nan     0.0010    0.0004
##      7        1.3141             nan     0.0010    0.0004
##      8        1.3132             nan     0.0010    0.0004
##      9        1.3123             nan     0.0010    0.0005
##     10        1.3113             nan     0.0010    0.0005
##     20        1.3018             nan     0.0010    0.0004
##     40        1.2842             nan     0.0010    0.0004
##     60        1.2667             nan     0.0010    0.0004
##     80        1.2499             nan     0.0010    0.0004
##    100        1.2338             nan     0.0010    0.0004
##    120        1.2182             nan     0.0010    0.0004
##    140        1.2029             nan     0.0010    0.0003
##    160        1.1883             nan     0.0010    0.0003
##    180        1.1737             nan     0.0010    0.0003
##    200        1.1599             nan     0.0010    0.0003
##    220        1.1463             nan     0.0010    0.0003
##    240        1.1334             nan     0.0010    0.0003
##    260        1.1209             nan     0.0010    0.0003
##    280        1.1083             nan     0.0010    0.0003
##    300        1.0966             nan     0.0010    0.0003
##    320        1.0850             nan     0.0010    0.0002
##    340        1.0738             nan     0.0010    0.0002
##    360        1.0631             nan     0.0010    0.0002
##    380        1.0525             nan     0.0010    0.0002
##    400        1.0420             nan     0.0010    0.0002
##    420        1.0320             nan     0.0010    0.0002
##    440        1.0225             nan     0.0010    0.0002
##    460        1.0129             nan     0.0010    0.0002
##    480        1.0035             nan     0.0010    0.0002
##    500        0.9946             nan     0.0010    0.0001
## 
## (Verbose boosting logs for the remaining cross-validation resamples and
## tuning parameter combinations, covering shrinkage values of 0.0010,
## 0.0100, and 0.1000, omitted; each repeats the same
## Iter / TrainDeviance / ValidDeviance / StepSize / Improve format shown above.)
## 
## (iteration logs for the remaining resamples at shrinkage 0.1 follow the same pattern and are omitted for brevity)
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3199             nan     0.0010    0.0004
##      2        1.3191             nan     0.0010    0.0004
##      3        1.3181             nan     0.0010    0.0004
##      4        1.3172             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0004
##      6        1.3156             nan     0.0010    0.0004
##      7        1.3147             nan     0.0010    0.0004
##      8        1.3139             nan     0.0010    0.0003
##      9        1.3131             nan     0.0010    0.0004
##     10        1.3122             nan     0.0010    0.0004
##     20        1.3037             nan     0.0010    0.0004
##     40        1.2875             nan     0.0010    0.0004
##     60        1.2721             nan     0.0010    0.0003
##     80        1.2572             nan     0.0010    0.0004
##    100        1.2427             nan     0.0010    0.0003
##    120        1.2284             nan     0.0010    0.0003
##    140        1.2150             nan     0.0010    0.0003
##    160        1.2017             nan     0.0010    0.0003
##    180        1.1888             nan     0.0010    0.0003
##    200        1.1765             nan     0.0010    0.0002
##    220        1.1647             nan     0.0010    0.0002
##    240        1.1529             nan     0.0010    0.0002
##    260        1.1414             nan     0.0010    0.0003
##    280        1.1305             nan     0.0010    0.0002
##    300        1.1197             nan     0.0010    0.0002
##    320        1.1093             nan     0.0010    0.0002
##    340        1.0992             nan     0.0010    0.0002
##    360        1.0892             nan     0.0010    0.0002
##    380        1.0795             nan     0.0010    0.0002
##    400        1.0700             nan     0.0010    0.0002
##    420        1.0610             nan     0.0010    0.0002
##    440        1.0521             nan     0.0010    0.0002
##    460        1.0434             nan     0.0010    0.0002
##    480        1.0349             nan     0.0010    0.0002
##    500        1.0268             nan     0.0010    0.0002
## 
## (iteration logs for the remaining resamples at shrinkage 0.001 follow the same pattern and are omitted for brevity)
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3124             nan     0.0100    0.0038
##      2        1.3046             nan     0.0100    0.0038
##      3        1.2969             nan     0.0100    0.0037
##      4        1.2886             nan     0.0100    0.0036
##      5        1.2809             nan     0.0100    0.0039
##      6        1.2731             nan     0.0100    0.0039
##      7        1.2656             nan     0.0100    0.0034
##      8        1.2582             nan     0.0100    0.0034
##      9        1.2510             nan     0.0100    0.0031
##     10        1.2440             nan     0.0100    0.0033
##     20        1.1780             nan     0.0100    0.0032
##     40        1.0692             nan     0.0100    0.0020
##     60        0.9863             nan     0.0100    0.0012
##     80        0.9228             nan     0.0100    0.0012
##    100        0.8716             nan     0.0100    0.0009
##    120        0.8302             nan     0.0100    0.0005
##    140        0.7957             nan     0.0100    0.0002
##    160        0.7674             nan     0.0100    0.0006
##    180        0.7415             nan     0.0100    0.0003
##    200        0.7195             nan     0.0100    0.0001
##    220        0.7003             nan     0.0100   -0.0001
##    240        0.6829             nan     0.0100    0.0001
##    260        0.6663             nan     0.0100   -0.0000
##    280        0.6514             nan     0.0100   -0.0000
##    300        0.6387             nan     0.0100   -0.0001
##    320        0.6262             nan     0.0100    0.0000
##    340        0.6156             nan     0.0100   -0.0000
##    360        0.6058             nan     0.0100    0.0000
##    380        0.5948             nan     0.0100   -0.0001
##    400        0.5846             nan     0.0100   -0.0000
##    420        0.5751             nan     0.0100    0.0001
##    440        0.5666             nan     0.0100    0.0001
##    460        0.5578             nan     0.0100   -0.0002
##    480        0.5494             nan     0.0100   -0.0000
##    500        0.5405             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3122             nan     0.0100    0.0041
##      2        1.3041             nan     0.0100    0.0038
##      3        1.2953             nan     0.0100    0.0037
##      4        1.2871             nan     0.0100    0.0037
##      5        1.2791             nan     0.0100    0.0037
##      6        1.2718             nan     0.0100    0.0035
##      7        1.2637             nan     0.0100    0.0035
##      8        1.2557             nan     0.0100    0.0037
##      9        1.2483             nan     0.0100    0.0033
##     10        1.2413             nan     0.0100    0.0034
##     20        1.1759             nan     0.0100    0.0024
##     40        1.0711             nan     0.0100    0.0023
##     60        0.9888             nan     0.0100    0.0012
##     80        0.9243             nan     0.0100    0.0010
##    100        0.8730             nan     0.0100    0.0009
##    120        0.8319             nan     0.0100    0.0009
##    140        0.7970             nan     0.0100    0.0003
##    160        0.7680             nan     0.0100    0.0005
##    180        0.7433             nan     0.0100    0.0003
##    200        0.7212             nan     0.0100    0.0002
##    220        0.7034             nan     0.0100    0.0001
##    240        0.6864             nan     0.0100    0.0002
##    260        0.6715             nan     0.0100    0.0001
##    280        0.6584             nan     0.0100    0.0000
##    300        0.6456             nan     0.0100    0.0000
##    320        0.6340             nan     0.0100    0.0000
##    340        0.6233             nan     0.0100   -0.0001
##    360        0.6132             nan     0.0100    0.0000
##    380        0.6030             nan     0.0100   -0.0001
##    400        0.5929             nan     0.0100   -0.0000
##    420        0.5851             nan     0.0100   -0.0001
##    440        0.5760             nan     0.0100    0.0000
##    460        0.5685             nan     0.0100    0.0000
##    480        0.5601             nan     0.0100   -0.0001
##    500        0.5516             nan     0.0100   -0.0001
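The per-iteration tables above are gbm's default verbose fitting log, printed once per resample and tuning combination. A minimal sketch of suppressing this output on refit, assuming a caret workflow (the `train_data` object name and resampling scheme here are illustrative, not taken from the report):

```r
##################################
# Suppressing verbose gbm iteration logs
##################################
library(caret)

# verbose = FALSE is forwarded through train()'s "..." to gbm::gbm,
# silencing the Iter / TrainDeviance tables shown above
gbm_fit <- train(diagnosis ~ .,
                 data = train_data,   # hypothetical training-set name
                 method = "gbm",
                 trControl = trainControl(method = "cv", number = 10),
                 verbose = FALSE)
```

In a knitr document the same effect can be had at the chunk level (e.g. `results='hide'`), but passing `verbose = FALSE` keeps the rendered report clean regardless of chunk options.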
## 
## [Near-identical per-iteration logs for the remaining shrinkage = 0.01 resamples omitted.]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2388             nan     0.1000    0.0382
##      2        1.1687             nan     0.1000    0.0291
##      3        1.1093             nan     0.1000    0.0271
##      4        1.0617             nan     0.1000    0.0211
##      5        1.0170             nan     0.1000    0.0172
##      6        0.9818             nan     0.1000    0.0127
##      7        0.9496             nan     0.1000    0.0123
##      8        0.9178             nan     0.1000    0.0119
##      9        0.8953             nan     0.1000    0.0053
##     10        0.8682             nan     0.1000    0.0116
##     20        0.7182             nan     0.1000    0.0017
##     40        0.5925             nan     0.1000   -0.0002
##     60        0.5183             nan     0.1000   -0.0007
##     80        0.4438             nan     0.1000   -0.0010
##    100        0.3936             nan     0.1000   -0.0001
##    120        0.3578             nan     0.1000   -0.0002
##    140        0.3196             nan     0.1000   -0.0008
##    160        0.2872             nan     0.1000   -0.0008
##    180        0.2562             nan     0.1000   -0.0001
##    200        0.2286             nan     0.1000   -0.0000
##    220        0.2058             nan     0.1000    0.0001
##    240        0.1866             nan     0.1000   -0.0006
##    260        0.1690             nan     0.1000   -0.0001
##    280        0.1538             nan     0.1000   -0.0003
##    300        0.1420             nan     0.1000   -0.0001
##    320        0.1294             nan     0.1000   -0.0002
##    340        0.1167             nan     0.1000   -0.0004
##    360        0.1065             nan     0.1000   -0.0004
##    380        0.0978             nan     0.1000   -0.0000
##    400        0.0904             nan     0.1000   -0.0001
##    420        0.0833             nan     0.1000   -0.0002
##    440        0.0775             nan     0.1000   -0.0001
##    460        0.0713             nan     0.1000   -0.0002
##    480        0.0657             nan     0.1000   -0.0002
##    500        0.0603             nan     0.1000   -0.0001
## 
## [Near-identical per-iteration logs for the remaining shrinkage = 0.1 resamples omitted.]
## 
## [... further iteration logs of the same form (step size 0.1000) for the remaining resamples omitted ...]
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3189             nan     0.0010    0.0004
##      3        1.3181             nan     0.0010    0.0003
##      4        1.3172             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0004
##      6        1.3155             nan     0.0010    0.0004
##      7        1.3146             nan     0.0010    0.0004
##      8        1.3137             nan     0.0010    0.0004
##      9        1.3128             nan     0.0010    0.0004
##     10        1.3119             nan     0.0010    0.0004
##     20        1.3037             nan     0.0010    0.0003
##     40        1.2873             nan     0.0010    0.0004
##     60        1.2715             nan     0.0010    0.0004
##     80        1.2565             nan     0.0010    0.0003
##    100        1.2417             nan     0.0010    0.0003
##    120        1.2276             nan     0.0010    0.0003
##    140        1.2138             nan     0.0010    0.0003
##    160        1.2003             nan     0.0010    0.0003
##    180        1.1874             nan     0.0010    0.0003
##    200        1.1748             nan     0.0010    0.0002
##    220        1.1630             nan     0.0010    0.0003
##    240        1.1512             nan     0.0010    0.0002
##    260        1.1395             nan     0.0010    0.0003
##    280        1.1284             nan     0.0010    0.0002
##    300        1.1175             nan     0.0010    0.0003
##    320        1.1072             nan     0.0010    0.0002
##    340        1.0970             nan     0.0010    0.0002
##    360        1.0872             nan     0.0010    0.0002
##    380        1.0777             nan     0.0010    0.0002
##    400        1.0685             nan     0.0010    0.0002
##    420        1.0593             nan     0.0010    0.0002
##    440        1.0504             nan     0.0010    0.0002
##    460        1.0416             nan     0.0010    0.0002
##    480        1.0331             nan     0.0010    0.0002
##    500        1.0249             nan     0.0010    0.0001
## 
## [... further iteration logs of the same form (step size 0.0010) for the remaining resamples omitted ...]
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3124             nan     0.0100    0.0038
##      2        1.3049             nan     0.0100    0.0037
##      3        1.2969             nan     0.0100    0.0039
##      4        1.2888             nan     0.0100    0.0034
##      5        1.2806             nan     0.0100    0.0037
##      6        1.2719             nan     0.0100    0.0041
##      7        1.2643             nan     0.0100    0.0032
##      8        1.2569             nan     0.0100    0.0033
##      9        1.2502             nan     0.0100    0.0032
##     10        1.2433             nan     0.0100    0.0030
##     20        1.1760             nan     0.0100    0.0025
##     40        1.0683             nan     0.0100    0.0021
##     60        0.9871             nan     0.0100    0.0015
##     80        0.9241             nan     0.0100    0.0009
##    100        0.8714             nan     0.0100    0.0009
##    120        0.8295             nan     0.0100    0.0006
##    140        0.7950             nan     0.0100    0.0003
##    160        0.7657             nan     0.0100    0.0004
##    180        0.7411             nan     0.0100    0.0004
##    200        0.7176             nan     0.0100    0.0001
##    220        0.6976             nan     0.0100    0.0001
##    240        0.6802             nan     0.0100    0.0001
##    260        0.6652             nan     0.0100    0.0001
##    280        0.6519             nan     0.0100    0.0001
##    300        0.6376             nan     0.0100   -0.0000
##    320        0.6249             nan     0.0100   -0.0001
##    340        0.6132             nan     0.0100    0.0001
##    360        0.6028             nan     0.0100    0.0000
##    380        0.5918             nan     0.0100    0.0001
##    400        0.5818             nan     0.0100   -0.0000
##    420        0.5723             nan     0.0100   -0.0000
##    440        0.5634             nan     0.0100   -0.0000
##    460        0.5545             nan     0.0100    0.0000
##    480        0.5465             nan     0.0100    0.0001
##    500        0.5392             nan     0.0100    0.0000
## 
## [... further iteration logs of the same form (step size 0.0100) for the remaining resamples omitted ...]
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3121             nan     0.0100    0.0041
##      2        1.3032             nan     0.0100    0.0038
##      3        1.2941             nan     0.0100    0.0042
##      4        1.2859             nan     0.0100    0.0036
##      5        1.2774             nan     0.0100    0.0040
##      6        1.2690             nan     0.0100    0.0036
##      7        1.2602             nan     0.0100    0.0040
##      8        1.2519             nan     0.0100    0.0040
##      9        1.2439             nan     0.0100    0.0035
##     10        1.2361             nan     0.0100    0.0030
##     20        1.1680             nan     0.0100    0.0024
##     40        1.0547             nan     0.0100    0.0022
##     60        0.9700             nan     0.0100    0.0015
##     80        0.9026             nan     0.0100    0.0012
##    100        0.8493             nan     0.0100    0.0009
##    120        0.8046             nan     0.0100    0.0007
##    140        0.7685             nan     0.0100    0.0006
##    160        0.7377             nan     0.0100    0.0004
##    180        0.7110             nan     0.0100    0.0003
##    200        0.6885             nan     0.0100    0.0004
##    220        0.6672             nan     0.0100    0.0000
##    240        0.6494             nan     0.0100    0.0002
##    260        0.6330             nan     0.0100    0.0000
##    280        0.6179             nan     0.0100   -0.0001
##    300        0.6055             nan     0.0100    0.0001
##    320        0.5928             nan     0.0100    0.0000
##    340        0.5796             nan     0.0100    0.0000
##    360        0.5671             nan     0.0100   -0.0000
##    380        0.5564             nan     0.0100    0.0001
##    400        0.5461             nan     0.0100   -0.0000
##    420        0.5366             nan     0.0100   -0.0002
##    440        0.5263             nan     0.0100    0.0000
##    460        0.5170             nan     0.0100    0.0001
##    480        0.5075             nan     0.0100   -0.0000
##    500        0.4987             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3116             nan     0.0100    0.0041
##      2        1.3026             nan     0.0100    0.0039
##      3        1.2940             nan     0.0100    0.0042
##      4        1.2856             nan     0.0100    0.0042
##      5        1.2772             nan     0.0100    0.0040
##      6        1.2689             nan     0.0100    0.0041
##      7        1.2608             nan     0.0100    0.0037
##      8        1.2531             nan     0.0100    0.0036
##      9        1.2455             nan     0.0100    0.0032
##     10        1.2381             nan     0.0100    0.0035
##     20        1.1700             nan     0.0100    0.0029
##     40        1.0587             nan     0.0100    0.0019
##     60        0.9729             nan     0.0100    0.0013
##     80        0.9059             nan     0.0100    0.0013
##    100        0.8523             nan     0.0100    0.0006
##    120        0.8083             nan     0.0100    0.0004
##    140        0.7724             nan     0.0100    0.0007
##    160        0.7412             nan     0.0100    0.0005
##    180        0.7152             nan     0.0100   -0.0001
##    200        0.6921             nan     0.0100    0.0002
##    220        0.6712             nan     0.0100    0.0003
##    240        0.6530             nan     0.0100   -0.0001
##    260        0.6378             nan     0.0100    0.0001
##    280        0.6226             nan     0.0100   -0.0000
##    300        0.6095             nan     0.0100    0.0001
##    320        0.5971             nan     0.0100   -0.0002
##    340        0.5830             nan     0.0100   -0.0000
##    360        0.5706             nan     0.0100   -0.0001
##    380        0.5603             nan     0.0100   -0.0000
##    400        0.5489             nan     0.0100    0.0001
##    420        0.5386             nan     0.0100   -0.0001
##    440        0.5291             nan     0.0100    0.0000
##    460        0.5201             nan     0.0100    0.0000
##    480        0.5113             nan     0.0100   -0.0001
##    500        0.5027             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3118             nan     0.0100    0.0041
##      2        1.3021             nan     0.0100    0.0042
##      3        1.2927             nan     0.0100    0.0042
##      4        1.2842             nan     0.0100    0.0041
##      5        1.2762             nan     0.0100    0.0034
##      6        1.2672             nan     0.0100    0.0039
##      7        1.2587             nan     0.0100    0.0040
##      8        1.2504             nan     0.0100    0.0038
##      9        1.2425             nan     0.0100    0.0036
##     10        1.2339             nan     0.0100    0.0037
##     20        1.1602             nan     0.0100    0.0028
##     40        1.0430             nan     0.0100    0.0020
##     60        0.9540             nan     0.0100    0.0016
##     80        0.8839             nan     0.0100    0.0010
##    100        0.8287             nan     0.0100    0.0009
##    120        0.7829             nan     0.0100    0.0006
##    140        0.7430             nan     0.0100    0.0005
##    160        0.7100             nan     0.0100    0.0003
##    180        0.6816             nan     0.0100    0.0002
##    200        0.6566             nan     0.0100    0.0001
##    220        0.6339             nan     0.0100    0.0000
##    240        0.6130             nan     0.0100    0.0000
##    260        0.5939             nan     0.0100    0.0000
##    280        0.5769             nan     0.0100    0.0001
##    300        0.5621             nan     0.0100    0.0000
##    320        0.5483             nan     0.0100   -0.0000
##    340        0.5333             nan     0.0100    0.0001
##    360        0.5201             nan     0.0100   -0.0000
##    380        0.5073             nan     0.0100    0.0000
##    400        0.4955             nan     0.0100    0.0001
##    420        0.4844             nan     0.0100    0.0000
##    440        0.4740             nan     0.0100    0.0000
##    460        0.4641             nan     0.0100    0.0001
##    480        0.4543             nan     0.0100   -0.0003
##    500        0.4447             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3106             nan     0.0100    0.0042
##      2        1.3015             nan     0.0100    0.0046
##      3        1.2923             nan     0.0100    0.0042
##      4        1.2832             nan     0.0100    0.0042
##      5        1.2742             nan     0.0100    0.0038
##      6        1.2660             nan     0.0100    0.0040
##      7        1.2578             nan     0.0100    0.0037
##      8        1.2498             nan     0.0100    0.0037
##      9        1.2414             nan     0.0100    0.0040
##     10        1.2337             nan     0.0100    0.0036
##     20        1.1602             nan     0.0100    0.0030
##     40        1.0442             nan     0.0100    0.0018
##     60        0.9556             nan     0.0100    0.0017
##     80        0.8865             nan     0.0100    0.0013
##    100        0.8326             nan     0.0100    0.0008
##    120        0.7860             nan     0.0100    0.0007
##    140        0.7476             nan     0.0100    0.0006
##    160        0.7151             nan     0.0100    0.0004
##    180        0.6870             nan     0.0100    0.0003
##    200        0.6629             nan     0.0100    0.0000
##    220        0.6408             nan     0.0100    0.0002
##    240        0.6206             nan     0.0100    0.0001
##    260        0.6024             nan     0.0100    0.0001
##    280        0.5843             nan     0.0100    0.0002
##    300        0.5680             nan     0.0100    0.0001
##    320        0.5538             nan     0.0100    0.0000
##    340        0.5392             nan     0.0100   -0.0000
##    360        0.5260             nan     0.0100    0.0001
##    380        0.5148             nan     0.0100    0.0001
##    400        0.5018             nan     0.0100    0.0000
##    420        0.4902             nan     0.0100    0.0000
##    440        0.4786             nan     0.0100   -0.0001
##    460        0.4682             nan     0.0100   -0.0001
##    480        0.4580             nan     0.0100   -0.0001
##    500        0.4486             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3108             nan     0.0100    0.0043
##      2        1.3019             nan     0.0100    0.0039
##      3        1.2936             nan     0.0100    0.0035
##      4        1.2849             nan     0.0100    0.0042
##      5        1.2759             nan     0.0100    0.0038
##      6        1.2665             nan     0.0100    0.0044
##      7        1.2585             nan     0.0100    0.0037
##      8        1.2501             nan     0.0100    0.0037
##      9        1.2419             nan     0.0100    0.0039
##     10        1.2341             nan     0.0100    0.0038
##     20        1.1620             nan     0.0100    0.0027
##     40        1.0472             nan     0.0100    0.0021
##     60        0.9606             nan     0.0100    0.0017
##     80        0.8927             nan     0.0100    0.0013
##    100        0.8378             nan     0.0100    0.0009
##    120        0.7948             nan     0.0100    0.0007
##    140        0.7575             nan     0.0100    0.0006
##    160        0.7250             nan     0.0100    0.0003
##    180        0.6976             nan     0.0100    0.0002
##    200        0.6729             nan     0.0100    0.0003
##    220        0.6503             nan     0.0100    0.0002
##    240        0.6305             nan     0.0100    0.0001
##    260        0.6126             nan     0.0100    0.0000
##    280        0.5961             nan     0.0100    0.0001
##    300        0.5808             nan     0.0100    0.0001
##    320        0.5664             nan     0.0100    0.0001
##    340        0.5521             nan     0.0100    0.0001
##    360        0.5389             nan     0.0100   -0.0000
##    380        0.5266             nan     0.0100    0.0001
##    400        0.5143             nan     0.0100    0.0002
##    420        0.5034             nan     0.0100   -0.0001
##    440        0.4923             nan     0.0100    0.0002
##    460        0.4825             nan     0.0100   -0.0001
##    480        0.4727             nan     0.0100   -0.0001
##    500        0.4638             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2483             nan     0.1000    0.0306
##      2        1.1826             nan     0.1000    0.0268
##      3        1.1189             nan     0.1000    0.0292
##      4        1.0711             nan     0.1000    0.0230
##      5        1.0335             nan     0.1000    0.0149
##      6        0.9966             nan     0.1000    0.0168
##      7        0.9630             nan     0.1000    0.0138
##      8        0.9302             nan     0.1000    0.0130
##      9        0.8994             nan     0.1000    0.0111
##     10        0.8727             nan     0.1000    0.0105
##     20        0.7253             nan     0.1000    0.0006
##     40        0.5918             nan     0.1000   -0.0014
##     60        0.5021             nan     0.1000   -0.0016
##     80        0.4391             nan     0.1000   -0.0009
##    100        0.3877             nan     0.1000   -0.0002
##    120        0.3458             nan     0.1000   -0.0008
##    140        0.3072             nan     0.1000   -0.0005
##    160        0.2721             nan     0.1000   -0.0004
##    180        0.2441             nan     0.1000   -0.0001
##    200        0.2215             nan     0.1000   -0.0002
##    220        0.1999             nan     0.1000    0.0001
##    240        0.1821             nan     0.1000   -0.0001
##    260        0.1663             nan     0.1000   -0.0003
##    280        0.1533             nan     0.1000   -0.0005
##    300        0.1399             nan     0.1000    0.0001
##    320        0.1278             nan     0.1000   -0.0004
##    340        0.1169             nan     0.1000   -0.0001
##    360        0.1067             nan     0.1000   -0.0000
##    380        0.0985             nan     0.1000   -0.0001
##    400        0.0909             nan     0.1000    0.0000
##    420        0.0836             nan     0.1000    0.0001
##    440        0.0775             nan     0.1000   -0.0003
##    460        0.0708             nan     0.1000   -0.0001
##    480        0.0654             nan     0.1000   -0.0003
##    500        0.0605             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2383             nan     0.1000    0.0402
##      2        1.1679             nan     0.1000    0.0306
##      3        1.1096             nan     0.1000    0.0246
##      4        1.0571             nan     0.1000    0.0252
##      5        1.0135             nan     0.1000    0.0163
##      6        0.9796             nan     0.1000    0.0145
##      7        0.9468             nan     0.1000    0.0131
##      8        0.9171             nan     0.1000    0.0122
##      9        0.8901             nan     0.1000    0.0116
##     10        0.8697             nan     0.1000    0.0075
##     20        0.7178             nan     0.1000    0.0022
##     40        0.5903             nan     0.1000    0.0004
##     60        0.5104             nan     0.1000   -0.0007
##     80        0.4429             nan     0.1000    0.0004
##    100        0.3955             nan     0.1000   -0.0011
##    120        0.3543             nan     0.1000   -0.0013
##    140        0.3172             nan     0.1000    0.0004
##    160        0.2857             nan     0.1000   -0.0011
##    180        0.2617             nan     0.1000   -0.0004
##    200        0.2372             nan     0.1000    0.0002
##    220        0.2126             nan     0.1000   -0.0003
##    240        0.1948             nan     0.1000   -0.0007
##    260        0.1785             nan     0.1000   -0.0009
##    280        0.1622             nan     0.1000   -0.0006
##    300        0.1493             nan     0.1000   -0.0004
##    320        0.1396             nan     0.1000   -0.0003
##    340        0.1279             nan     0.1000   -0.0000
##    360        0.1176             nan     0.1000   -0.0003
##    380        0.1090             nan     0.1000   -0.0006
##    400        0.1000             nan     0.1000   -0.0002
##    420        0.0930             nan     0.1000   -0.0002
##    440        0.0855             nan     0.1000   -0.0002
##    460        0.0784             nan     0.1000   -0.0002
##    480        0.0731             nan     0.1000   -0.0003
##    500        0.0670             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2405             nan     0.1000    0.0386
##      2        1.1708             nan     0.1000    0.0287
##      3        1.1129             nan     0.1000    0.0265
##      4        1.0596             nan     0.1000    0.0223
##      5        1.0126             nan     0.1000    0.0213
##      6        0.9773             nan     0.1000    0.0144
##      7        0.9474             nan     0.1000    0.0112
##      8        0.9179             nan     0.1000    0.0127
##      9        0.8930             nan     0.1000    0.0106
##     10        0.8675             nan     0.1000    0.0087
##     20        0.7225             nan     0.1000    0.0021
##     40        0.5959             nan     0.1000   -0.0010
##     60        0.5162             nan     0.1000   -0.0005
##     80        0.4660             nan     0.1000   -0.0007
##    100        0.4098             nan     0.1000   -0.0012
##    120        0.3703             nan     0.1000   -0.0001
##    140        0.3327             nan     0.1000   -0.0010
##    160        0.3024             nan     0.1000   -0.0004
##    180        0.2746             nan     0.1000   -0.0008
##    200        0.2473             nan     0.1000   -0.0001
##    220        0.2245             nan     0.1000   -0.0006
##    240        0.2046             nan     0.1000   -0.0011
##    260        0.1849             nan     0.1000   -0.0004
##    280        0.1696             nan     0.1000   -0.0006
##    300        0.1545             nan     0.1000   -0.0004
##    320        0.1429             nan     0.1000   -0.0005
##    340        0.1324             nan     0.1000   -0.0006
##    360        0.1224             nan     0.1000   -0.0002
##    380        0.1129             nan     0.1000   -0.0004
##    400        0.1044             nan     0.1000   -0.0004
##    420        0.0976             nan     0.1000   -0.0003
##    440        0.0910             nan     0.1000   -0.0006
##    460        0.0839             nan     0.1000   -0.0003
##    480        0.0783             nan     0.1000   -0.0002
##    500        0.0724             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2361             nan     0.1000    0.0385
##      2        1.1581             nan     0.1000    0.0322
##      3        1.0994             nan     0.1000    0.0239
##      4        1.0521             nan     0.1000    0.0210
##      5        1.0070             nan     0.1000    0.0177
##      6        0.9646             nan     0.1000    0.0179
##      7        0.9290             nan     0.1000    0.0133
##      8        0.8963             nan     0.1000    0.0125
##      9        0.8661             nan     0.1000    0.0111
##     10        0.8413             nan     0.1000    0.0102
##     20        0.6926             nan     0.1000    0.0006
##     40        0.5433             nan     0.1000    0.0003
##     60        0.4517             nan     0.1000    0.0007
##     80        0.3828             nan     0.1000   -0.0006
##    100        0.3256             nan     0.1000   -0.0010
##    120        0.2820             nan     0.1000   -0.0007
##    140        0.2463             nan     0.1000   -0.0003
##    160        0.2157             nan     0.1000   -0.0003
##    180        0.1909             nan     0.1000   -0.0006
##    200        0.1690             nan     0.1000    0.0001
##    220        0.1505             nan     0.1000   -0.0004
##    240        0.1343             nan     0.1000    0.0004
##    260        0.1184             nan     0.1000   -0.0004
##    280        0.1058             nan     0.1000   -0.0003
##    300        0.0957             nan     0.1000   -0.0003
##    320        0.0861             nan     0.1000   -0.0001
##    340        0.0776             nan     0.1000   -0.0001
##    360        0.0702             nan     0.1000   -0.0004
##    380        0.0635             nan     0.1000    0.0000
##    400        0.0576             nan     0.1000   -0.0001
##    420        0.0523             nan     0.1000   -0.0001
##    440        0.0480             nan     0.1000   -0.0002
##    460        0.0435             nan     0.1000   -0.0001
##    480        0.0394             nan     0.1000   -0.0000
##    500        0.0360             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2369             nan     0.1000    0.0357
##      2        1.1730             nan     0.1000    0.0261
##      3        1.1129             nan     0.1000    0.0283
##      4        1.0593             nan     0.1000    0.0264
##      5        1.0148             nan     0.1000    0.0194
##      6        0.9754             nan     0.1000    0.0163
##      7        0.9439             nan     0.1000    0.0129
##      8        0.9099             nan     0.1000    0.0139
##      9        0.8850             nan     0.1000    0.0089
##     10        0.8606             nan     0.1000    0.0100
##     20        0.6933             nan     0.1000    0.0024
##     40        0.5384             nan     0.1000   -0.0003
##     60        0.4508             nan     0.1000   -0.0010
##     80        0.3819             nan     0.1000   -0.0000
##    100        0.3260             nan     0.1000   -0.0016
##    120        0.2834             nan     0.1000   -0.0007
##    140        0.2492             nan     0.1000   -0.0014
##    160        0.2154             nan     0.1000   -0.0007
##    180        0.1901             nan     0.1000   -0.0003
##    200        0.1668             nan     0.1000   -0.0004
##    220        0.1491             nan     0.1000   -0.0006
##    240        0.1331             nan     0.1000   -0.0006
##    260        0.1190             nan     0.1000   -0.0001
##    280        0.1059             nan     0.1000   -0.0004
##    300        0.0932             nan     0.1000   -0.0000
##    320        0.0846             nan     0.1000   -0.0006
##    340        0.0757             nan     0.1000   -0.0003
##    360        0.0688             nan     0.1000   -0.0002
##    380        0.0624             nan     0.1000   -0.0001
##    400        0.0559             nan     0.1000    0.0000
##    420        0.0509             nan     0.1000   -0.0003
##    440        0.0456             nan     0.1000   -0.0001
##    460        0.0418             nan     0.1000   -0.0002
##    480        0.0382             nan     0.1000   -0.0001
##    500        0.0348             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2411             nan     0.1000    0.0364
##      2        1.1674             nan     0.1000    0.0301
##      3        1.1078             nan     0.1000    0.0268
##      4        1.0594             nan     0.1000    0.0184
##      5        1.0101             nan     0.1000    0.0223
##      6        0.9692             nan     0.1000    0.0150
##      7        0.9322             nan     0.1000    0.0150
##      8        0.9005             nan     0.1000    0.0105
##      9        0.8757             nan     0.1000    0.0092
##     10        0.8509             nan     0.1000    0.0104
##     20        0.6978             nan     0.1000    0.0037
##     40        0.5521             nan     0.1000   -0.0003
##     60        0.4612             nan     0.1000    0.0006
##     80        0.4010             nan     0.1000   -0.0009
##    100        0.3486             nan     0.1000   -0.0014
##    120        0.3023             nan     0.1000   -0.0005
##    140        0.2687             nan     0.1000   -0.0014
##    160        0.2411             nan     0.1000   -0.0008
##    180        0.2162             nan     0.1000   -0.0004
##    200        0.1938             nan     0.1000   -0.0006
##    220        0.1769             nan     0.1000   -0.0004
##    240        0.1591             nan     0.1000   -0.0009
##    260        0.1424             nan     0.1000   -0.0005
##    280        0.1274             nan     0.1000   -0.0007
##    300        0.1145             nan     0.1000   -0.0005
##    320        0.1045             nan     0.1000   -0.0003
##    340        0.0946             nan     0.1000   -0.0005
##    360        0.0867             nan     0.1000   -0.0005
##    380        0.0780             nan     0.1000   -0.0004
##    400        0.0706             nan     0.1000   -0.0001
##    420        0.0649             nan     0.1000   -0.0003
##    440        0.0590             nan     0.1000   -0.0001
##    460        0.0542             nan     0.1000   -0.0003
##    480        0.0490             nan     0.1000   -0.0002
##    500        0.0442             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2318             nan     0.1000    0.0392
##      2        1.1556             nan     0.1000    0.0354
##      3        1.0920             nan     0.1000    0.0266
##      4        1.0376             nan     0.1000    0.0239
##      5        0.9931             nan     0.1000    0.0164
##      6        0.9529             nan     0.1000    0.0104
##      7        0.9194             nan     0.1000    0.0121
##      8        0.8903             nan     0.1000    0.0110
##      9        0.8580             nan     0.1000    0.0143
##     10        0.8321             nan     0.1000    0.0107
##     20        0.6625             nan     0.1000    0.0012
##     40        0.5012             nan     0.1000   -0.0009
##     60        0.4067             nan     0.1000   -0.0002
##     80        0.3337             nan     0.1000   -0.0004
##    100        0.2755             nan     0.1000   -0.0014
##    120        0.2341             nan     0.1000    0.0009
##    140        0.2010             nan     0.1000   -0.0002
##    160        0.1710             nan     0.1000   -0.0005
##    180        0.1482             nan     0.1000   -0.0006
##    200        0.1288             nan     0.1000   -0.0004
##    220        0.1120             nan     0.1000   -0.0005
##    240        0.0982             nan     0.1000   -0.0001
##    260        0.0869             nan     0.1000   -0.0005
##    280        0.0767             nan     0.1000   -0.0004
##    300        0.0684             nan     0.1000   -0.0003
##    320        0.0605             nan     0.1000   -0.0001
##    340        0.0531             nan     0.1000   -0.0001
##    360        0.0470             nan     0.1000   -0.0002
##    380        0.0412             nan     0.1000   -0.0000
##    400        0.0371             nan     0.1000   -0.0001
##    420        0.0326             nan     0.1000   -0.0001
##    440        0.0289             nan     0.1000   -0.0000
##    460        0.0259             nan     0.1000   -0.0000
##    480        0.0233             nan     0.1000   -0.0000
##    500        0.0208             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2288             nan     0.1000    0.0399
##      2        1.1572             nan     0.1000    0.0328
##      3        1.0878             nan     0.1000    0.0307
##      4        1.0359             nan     0.1000    0.0214
##      5        0.9871             nan     0.1000    0.0211
##      6        0.9441             nan     0.1000    0.0177
##      7        0.9103             nan     0.1000    0.0140
##      8        0.8803             nan     0.1000    0.0092
##      9        0.8450             nan     0.1000    0.0125
##     10        0.8216             nan     0.1000    0.0089
##     20        0.6622             nan     0.1000    0.0015
##     40        0.5085             nan     0.1000    0.0006
##     60        0.4171             nan     0.1000   -0.0011
##     80        0.3406             nan     0.1000   -0.0019
##    100        0.2853             nan     0.1000   -0.0006
##    120        0.2429             nan     0.1000   -0.0009
##    140        0.2099             nan     0.1000   -0.0008
##    160        0.1835             nan     0.1000    0.0002
##    180        0.1570             nan     0.1000    0.0000
##    200        0.1355             nan     0.1000   -0.0004
##    220        0.1181             nan     0.1000   -0.0005
##    240        0.1036             nan     0.1000    0.0001
##    260        0.0903             nan     0.1000   -0.0002
##    280        0.0782             nan     0.1000   -0.0003
##    300        0.0689             nan     0.1000   -0.0005
##    320        0.0615             nan     0.1000   -0.0002
##    340        0.0537             nan     0.1000   -0.0002
##    360        0.0475             nan     0.1000   -0.0001
##    380        0.0419             nan     0.1000   -0.0002
##    400        0.0375             nan     0.1000   -0.0001
##    420        0.0331             nan     0.1000   -0.0002
##    440        0.0294             nan     0.1000   -0.0001
##    460        0.0264             nan     0.1000   -0.0001
##    480        0.0232             nan     0.1000   -0.0001
##    500        0.0208             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2368             nan     0.1000    0.0403
##      2        1.1601             nan     0.1000    0.0285
##      3        1.1013             nan     0.1000    0.0274
##      4        1.0415             nan     0.1000    0.0241
##      5        0.9961             nan     0.1000    0.0159
##      6        0.9587             nan     0.1000    0.0161
##      7        0.9198             nan     0.1000    0.0137
##      8        0.8826             nan     0.1000    0.0143
##      9        0.8552             nan     0.1000    0.0089
##     10        0.8311             nan     0.1000    0.0083
##     20        0.6761             nan     0.1000    0.0034
##     40        0.5186             nan     0.1000    0.0008
##     60        0.4242             nan     0.1000   -0.0013
##     80        0.3561             nan     0.1000   -0.0019
##    100        0.3020             nan     0.1000   -0.0014
##    120        0.2617             nan     0.1000   -0.0012
##    140        0.2262             nan     0.1000   -0.0005
##    160        0.1930             nan     0.1000   -0.0011
##    180        0.1657             nan     0.1000   -0.0007
##    200        0.1460             nan     0.1000   -0.0003
##    220        0.1272             nan     0.1000   -0.0004
##    240        0.1115             nan     0.1000   -0.0008
##    260        0.0983             nan     0.1000   -0.0003
##    280        0.0867             nan     0.1000   -0.0001
##    300        0.0783             nan     0.1000   -0.0005
##    320        0.0691             nan     0.1000   -0.0001
##    340        0.0622             nan     0.1000   -0.0002
##    360        0.0544             nan     0.1000   -0.0001
##    380        0.0482             nan     0.1000   -0.0002
##    400        0.0426             nan     0.1000   -0.0002
##    420        0.0379             nan     0.1000   -0.0001
##    440        0.0336             nan     0.1000   -0.0001
##    460        0.0302             nan     0.1000   -0.0002
##    480        0.0268             nan     0.1000   -0.0002
##    500        0.0241             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3191             nan     0.0010    0.0003
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3175             nan     0.0010    0.0004
##      5        1.3166             nan     0.0010    0.0004
##      6        1.3158             nan     0.0010    0.0004
##      7        1.3150             nan     0.0010    0.0004
##      8        1.3141             nan     0.0010    0.0004
##      9        1.3133             nan     0.0010    0.0004
##     10        1.3123             nan     0.0010    0.0004
##     20        1.3042             nan     0.0010    0.0003
##     40        1.2885             nan     0.0010    0.0004
##     60        1.2730             nan     0.0010    0.0004
##     80        1.2583             nan     0.0010    0.0003
##    100        1.2437             nan     0.0010    0.0003
##    120        1.2299             nan     0.0010    0.0003
##    140        1.2168             nan     0.0010    0.0003
##    160        1.2041             nan     0.0010    0.0003
##    180        1.1914             nan     0.0010    0.0003
##    200        1.1793             nan     0.0010    0.0003
##    220        1.1675             nan     0.0010    0.0002
##    240        1.1561             nan     0.0010    0.0003
##    260        1.1451             nan     0.0010    0.0002
##    280        1.1343             nan     0.0010    0.0002
##    300        1.1239             nan     0.0010    0.0002
##    320        1.1137             nan     0.0010    0.0002
##    340        1.1034             nan     0.0010    0.0002
##    360        1.0939             nan     0.0010    0.0002
##    380        1.0847             nan     0.0010    0.0002
##    400        1.0755             nan     0.0010    0.0002
##    420        1.0665             nan     0.0010    0.0002
##    440        1.0580             nan     0.0010    0.0002
##    460        1.0494             nan     0.0010    0.0002
##    480        1.0413             nan     0.0010    0.0002
##    500        1.0333             nan     0.0010    0.0002
## 
## 
## [verbose gbm iteration logs for the remaining resamples and tuning
##  combinations truncated; each block repeats the same
##  Iter / TrainDeviance / ValidDeviance / StepSize / Improve trace
##  shown above, at shrinkage values of 0.0010, 0.0100 and 0.1000]
##    200        0.6589             nan     0.0100    0.0003
##    220        0.6351             nan     0.0100    0.0003
##    240        0.6151             nan     0.0100    0.0002
##    260        0.5969             nan     0.0100    0.0001
##    280        0.5795             nan     0.0100    0.0000
##    300        0.5631             nan     0.0100    0.0001
##    320        0.5477             nan     0.0100   -0.0000
##    340        0.5329             nan     0.0100    0.0000
##    360        0.5191             nan     0.0100   -0.0000
##    380        0.5071             nan     0.0100    0.0000
##    400        0.4956             nan     0.0100    0.0001
##    420        0.4847             nan     0.0100   -0.0001
##    440        0.4732             nan     0.0100   -0.0001
##    460        0.4626             nan     0.0100    0.0000
##    480        0.4518             nan     0.0100   -0.0002
##    500        0.4422             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3114             nan     0.0100    0.0043
##      2        1.3023             nan     0.0100    0.0041
##      3        1.2929             nan     0.0100    0.0043
##      4        1.2841             nan     0.0100    0.0034
##      5        1.2757             nan     0.0100    0.0038
##      6        1.2676             nan     0.0100    0.0038
##      7        1.2597             nan     0.0100    0.0033
##      8        1.2514             nan     0.0100    0.0038
##      9        1.2436             nan     0.0100    0.0035
##     10        1.2356             nan     0.0100    0.0034
##     20        1.1646             nan     0.0100    0.0030
##     40        1.0480             nan     0.0100    0.0021
##     60        0.9604             nan     0.0100    0.0015
##     80        0.8912             nan     0.0100    0.0012
##    100        0.8362             nan     0.0100    0.0008
##    120        0.7907             nan     0.0100    0.0005
##    140        0.7519             nan     0.0100    0.0006
##    160        0.7191             nan     0.0100    0.0004
##    180        0.6922             nan     0.0100    0.0001
##    200        0.6664             nan     0.0100   -0.0001
##    220        0.6443             nan     0.0100    0.0001
##    240        0.6235             nan     0.0100    0.0001
##    260        0.6051             nan     0.0100    0.0001
##    280        0.5879             nan     0.0100    0.0000
##    300        0.5714             nan     0.0100   -0.0000
##    320        0.5562             nan     0.0100   -0.0001
##    340        0.5425             nan     0.0100   -0.0001
##    360        0.5298             nan     0.0100    0.0000
##    380        0.5166             nan     0.0100   -0.0000
##    400        0.5042             nan     0.0100   -0.0001
##    420        0.4932             nan     0.0100    0.0000
##    440        0.4828             nan     0.0100    0.0001
##    460        0.4731             nan     0.0100   -0.0000
##    480        0.4624             nan     0.0100   -0.0001
##    500        0.4519             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3114             nan     0.0100    0.0041
##      2        1.3017             nan     0.0100    0.0040
##      3        1.2933             nan     0.0100    0.0039
##      4        1.2839             nan     0.0100    0.0039
##      5        1.2757             nan     0.0100    0.0040
##      6        1.2678             nan     0.0100    0.0034
##      7        1.2599             nan     0.0100    0.0039
##      8        1.2523             nan     0.0100    0.0032
##      9        1.2448             nan     0.0100    0.0034
##     10        1.2370             nan     0.0100    0.0040
##     20        1.1685             nan     0.0100    0.0027
##     40        1.0564             nan     0.0100    0.0023
##     60        0.9678             nan     0.0100    0.0015
##     80        0.8995             nan     0.0100    0.0012
##    100        0.8458             nan     0.0100    0.0012
##    120        0.8007             nan     0.0100    0.0008
##    140        0.7640             nan     0.0100    0.0005
##    160        0.7310             nan     0.0100    0.0004
##    180        0.7031             nan     0.0100    0.0003
##    200        0.6805             nan     0.0100    0.0002
##    220        0.6585             nan     0.0100    0.0001
##    240        0.6390             nan     0.0100    0.0001
##    260        0.6213             nan     0.0100    0.0001
##    280        0.6047             nan     0.0100    0.0001
##    300        0.5876             nan     0.0100    0.0001
##    320        0.5731             nan     0.0100    0.0000
##    340        0.5600             nan     0.0100   -0.0000
##    360        0.5466             nan     0.0100   -0.0001
##    380        0.5336             nan     0.0100    0.0002
##    400        0.5220             nan     0.0100   -0.0000
##    420        0.5110             nan     0.0100   -0.0001
##    440        0.4994             nan     0.0100   -0.0001
##    460        0.4880             nan     0.0100    0.0000
##    480        0.4783             nan     0.0100   -0.0001
##    500        0.4686             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2358             nan     0.1000    0.0361
##      2        1.1722             nan     0.1000    0.0271
##      3        1.1241             nan     0.1000    0.0208
##      4        1.0768             nan     0.1000    0.0178
##      5        1.0345             nan     0.1000    0.0170
##      6        0.9970             nan     0.1000    0.0151
##      7        0.9635             nan     0.1000    0.0134
##      8        0.9322             nan     0.1000    0.0127
##      9        0.9038             nan     0.1000    0.0103
##     10        0.8834             nan     0.1000    0.0058
##     20        0.7239             nan     0.1000    0.0003
##     40        0.5859             nan     0.1000    0.0006
##     60        0.5066             nan     0.1000   -0.0014
##     80        0.4385             nan     0.1000   -0.0016
##    100        0.3846             nan     0.1000   -0.0004
##    120        0.3490             nan     0.1000   -0.0008
##    140        0.3103             nan     0.1000   -0.0014
##    160        0.2814             nan     0.1000   -0.0001
##    180        0.2532             nan     0.1000   -0.0006
##    200        0.2285             nan     0.1000    0.0002
##    220        0.2057             nan     0.1000   -0.0007
##    240        0.1845             nan     0.1000   -0.0004
##    260        0.1663             nan     0.1000   -0.0005
##    280        0.1522             nan     0.1000   -0.0001
##    300        0.1384             nan     0.1000   -0.0004
##    320        0.1258             nan     0.1000   -0.0001
##    340        0.1145             nan     0.1000    0.0001
##    360        0.1043             nan     0.1000    0.0001
##    380        0.0962             nan     0.1000   -0.0001
##    400        0.0882             nan     0.1000   -0.0000
##    420        0.0813             nan     0.1000   -0.0003
##    440        0.0752             nan     0.1000   -0.0001
##    460        0.0695             nan     0.1000   -0.0002
##    480        0.0639             nan     0.1000   -0.0002
##    500        0.0592             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2392             nan     0.1000    0.0368
##      2        1.1800             nan     0.1000    0.0257
##      3        1.1249             nan     0.1000    0.0253
##      4        1.0707             nan     0.1000    0.0227
##      5        1.0281             nan     0.1000    0.0169
##      6        0.9956             nan     0.1000    0.0148
##      7        0.9642             nan     0.1000    0.0109
##      8        0.9372             nan     0.1000    0.0108
##      9        0.9085             nan     0.1000    0.0116
##     10        0.8823             nan     0.1000    0.0095
##     20        0.7362             nan     0.1000    0.0015
##     40        0.6135             nan     0.1000   -0.0018
##     60        0.5279             nan     0.1000   -0.0005
##     80        0.4641             nan     0.1000    0.0003
##    100        0.4138             nan     0.1000   -0.0011
##    120        0.3682             nan     0.1000   -0.0022
##    140        0.3332             nan     0.1000   -0.0009
##    160        0.2992             nan     0.1000   -0.0010
##    180        0.2715             nan     0.1000   -0.0006
##    200        0.2462             nan     0.1000   -0.0003
##    220        0.2246             nan     0.1000   -0.0002
##    240        0.2051             nan     0.1000   -0.0006
##    260        0.1863             nan     0.1000   -0.0003
##    280        0.1709             nan     0.1000   -0.0005
##    300        0.1551             nan     0.1000   -0.0004
##    320        0.1436             nan     0.1000   -0.0006
##    340        0.1327             nan     0.1000   -0.0003
##    360        0.1214             nan     0.1000   -0.0006
##    380        0.1117             nan     0.1000   -0.0000
##    400        0.1024             nan     0.1000   -0.0003
##    420        0.0961             nan     0.1000   -0.0001
##    440        0.0881             nan     0.1000   -0.0001
##    460        0.0819             nan     0.1000   -0.0004
##    480        0.0760             nan     0.1000   -0.0001
##    500        0.0705             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2413             nan     0.1000    0.0360
##      2        1.1728             nan     0.1000    0.0297
##      3        1.1214             nan     0.1000    0.0216
##      4        1.0726             nan     0.1000    0.0212
##      5        1.0274             nan     0.1000    0.0192
##      6        0.9897             nan     0.1000    0.0161
##      7        0.9557             nan     0.1000    0.0147
##      8        0.9329             nan     0.1000    0.0087
##      9        0.9073             nan     0.1000    0.0091
##     10        0.8776             nan     0.1000    0.0128
##     20        0.7336             nan     0.1000    0.0023
##     40        0.6027             nan     0.1000    0.0001
##     60        0.5247             nan     0.1000   -0.0004
##     80        0.4603             nan     0.1000    0.0002
##    100        0.4095             nan     0.1000    0.0000
##    120        0.3634             nan     0.1000    0.0003
##    140        0.3272             nan     0.1000   -0.0009
##    160        0.2945             nan     0.1000   -0.0005
##    180        0.2709             nan     0.1000   -0.0009
##    200        0.2481             nan     0.1000   -0.0005
##    220        0.2256             nan     0.1000   -0.0006
##    240        0.2067             nan     0.1000   -0.0009
##    260        0.1874             nan     0.1000   -0.0001
##    280        0.1726             nan     0.1000   -0.0006
##    300        0.1604             nan     0.1000   -0.0007
##    320        0.1484             nan     0.1000   -0.0006
##    340        0.1383             nan     0.1000   -0.0002
##    360        0.1287             nan     0.1000   -0.0005
##    380        0.1189             nan     0.1000   -0.0003
##    400        0.1099             nan     0.1000   -0.0003
##    420        0.1003             nan     0.1000   -0.0003
##    440        0.0948             nan     0.1000   -0.0005
##    460        0.0884             nan     0.1000   -0.0003
##    480        0.0818             nan     0.1000   -0.0003
##    500        0.0767             nan     0.1000   -0.0003
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2407             nan     0.1000    0.0394
##      2        1.1664             nan     0.1000    0.0345
##      3        1.1055             nan     0.1000    0.0273
##      4        1.0577             nan     0.1000    0.0185
##      5        1.0167             nan     0.1000    0.0171
##      6        0.9801             nan     0.1000    0.0154
##      7        0.9461             nan     0.1000    0.0127
##      8        0.9154             nan     0.1000    0.0111
##      9        0.8882             nan     0.1000    0.0113
##     10        0.8582             nan     0.1000    0.0114
##     20        0.6922             nan     0.1000    0.0017
##     40        0.5419             nan     0.1000    0.0004
##     60        0.4596             nan     0.1000   -0.0013
##     80        0.3912             nan     0.1000    0.0003
##    100        0.3396             nan     0.1000   -0.0004
##    120        0.2887             nan     0.1000   -0.0007
##    140        0.2524             nan     0.1000   -0.0001
##    160        0.2200             nan     0.1000   -0.0008
##    180        0.1915             nan     0.1000    0.0000
##    200        0.1711             nan     0.1000   -0.0005
##    220        0.1515             nan     0.1000    0.0001
##    240        0.1326             nan     0.1000   -0.0001
##    260        0.1192             nan     0.1000   -0.0003
##    280        0.1069             nan     0.1000   -0.0003
##    300        0.0957             nan     0.1000   -0.0002
##    320        0.0838             nan     0.1000    0.0000
##    340        0.0760             nan     0.1000   -0.0000
##    360        0.0682             nan     0.1000   -0.0001
##    380        0.0622             nan     0.1000   -0.0001
##    400        0.0561             nan     0.1000   -0.0002
##    420        0.0504             nan     0.1000   -0.0001
##    440        0.0456             nan     0.1000   -0.0001
##    460        0.0413             nan     0.1000   -0.0001
##    480        0.0371             nan     0.1000   -0.0001
##    500        0.0338             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2365             nan     0.1000    0.0357
##      2        1.1712             nan     0.1000    0.0289
##      3        1.1099             nan     0.1000    0.0272
##      4        1.0565             nan     0.1000    0.0207
##      5        1.0106             nan     0.1000    0.0189
##      6        0.9661             nan     0.1000    0.0191
##      7        0.9332             nan     0.1000    0.0117
##      8        0.9000             nan     0.1000    0.0116
##      9        0.8742             nan     0.1000    0.0095
##     10        0.8471             nan     0.1000    0.0103
##     20        0.6991             nan     0.1000    0.0026
##     40        0.5613             nan     0.1000   -0.0014
##     60        0.4725             nan     0.1000   -0.0013
##     80        0.4056             nan     0.1000    0.0002
##    100        0.3510             nan     0.1000   -0.0005
##    120        0.3024             nan     0.1000   -0.0010
##    140        0.2663             nan     0.1000   -0.0007
##    160        0.2359             nan     0.1000   -0.0007
##    180        0.2099             nan     0.1000   -0.0009
##    200        0.1861             nan     0.1000   -0.0005
##    220        0.1633             nan     0.1000    0.0001
##    240        0.1462             nan     0.1000   -0.0003
##    260        0.1301             nan     0.1000   -0.0004
##    280        0.1162             nan     0.1000   -0.0007
##    300        0.1056             nan     0.1000   -0.0004
##    320        0.0952             nan     0.1000   -0.0003
##    340        0.0859             nan     0.1000   -0.0005
##    360        0.0771             nan     0.1000   -0.0003
##    380        0.0694             nan     0.1000   -0.0002
##    400        0.0619             nan     0.1000   -0.0001
##    420        0.0557             nan     0.1000   -0.0002
##    440        0.0505             nan     0.1000   -0.0002
##    460        0.0457             nan     0.1000   -0.0002
##    480        0.0410             nan     0.1000   -0.0000
##    500        0.0370             nan     0.1000   -0.0003
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2387             nan     0.1000    0.0383
##      2        1.1700             nan     0.1000    0.0316
##      3        1.1112             nan     0.1000    0.0259
##      4        1.0615             nan     0.1000    0.0199
##      5        1.0132             nan     0.1000    0.0221
##      6        0.9741             nan     0.1000    0.0170
##      7        0.9389             nan     0.1000    0.0150
##      8        0.9125             nan     0.1000    0.0084
##      9        0.8851             nan     0.1000    0.0100
##     10        0.8644             nan     0.1000    0.0048
##     20        0.7091             nan     0.1000    0.0011
##     40        0.5621             nan     0.1000   -0.0020
##     60        0.4661             nan     0.1000    0.0008
##     80        0.3954             nan     0.1000    0.0000
##    100        0.3429             nan     0.1000   -0.0011
##    120        0.2972             nan     0.1000   -0.0009
##    140        0.2613             nan     0.1000   -0.0020
##    160        0.2318             nan     0.1000   -0.0005
##    180        0.2075             nan     0.1000   -0.0008
##    200        0.1850             nan     0.1000   -0.0008
##    220        0.1646             nan     0.1000   -0.0005
##    240        0.1495             nan     0.1000   -0.0010
##    260        0.1349             nan     0.1000   -0.0004
##    280        0.1215             nan     0.1000   -0.0004
##    300        0.1097             nan     0.1000   -0.0006
##    320        0.0981             nan     0.1000   -0.0003
##    340        0.0883             nan     0.1000   -0.0002
##    360        0.0795             nan     0.1000   -0.0002
##    380        0.0728             nan     0.1000   -0.0005
##    400        0.0657             nan     0.1000   -0.0001
##    420        0.0597             nan     0.1000   -0.0002
##    440        0.0540             nan     0.1000   -0.0001
##    460        0.0492             nan     0.1000   -0.0002
##    480        0.0444             nan     0.1000   -0.0001
##    500        0.0409             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2343             nan     0.1000    0.0362
##      2        1.1576             nan     0.1000    0.0301
##      3        1.0929             nan     0.1000    0.0277
##      4        1.0390             nan     0.1000    0.0227
##      5        0.9953             nan     0.1000    0.0196
##      6        0.9556             nan     0.1000    0.0172
##      7        0.9171             nan     0.1000    0.0139
##      8        0.8886             nan     0.1000    0.0111
##      9        0.8605             nan     0.1000    0.0094
##     10        0.8324             nan     0.1000    0.0106
##     20        0.6656             nan     0.1000    0.0025
##     40        0.5056             nan     0.1000   -0.0002
##     60        0.4084             nan     0.1000   -0.0001
##     80        0.3370             nan     0.1000   -0.0004
##    100        0.2806             nan     0.1000   -0.0001
##    120        0.2350             nan     0.1000   -0.0003
##    140        0.1994             nan     0.1000   -0.0005
##    160        0.1704             nan     0.1000   -0.0000
##    180        0.1460             nan     0.1000    0.0002
##    200        0.1282             nan     0.1000   -0.0004
##    220        0.1110             nan     0.1000   -0.0002
##    240        0.0973             nan     0.1000   -0.0003
##    260        0.0847             nan     0.1000   -0.0003
##    280        0.0741             nan     0.1000   -0.0001
##    300        0.0637             nan     0.1000   -0.0000
##    320        0.0559             nan     0.1000   -0.0001
##    340        0.0486             nan     0.1000   -0.0000
##    360        0.0433             nan     0.1000   -0.0002
##    380        0.0382             nan     0.1000   -0.0001
##    400        0.0339             nan     0.1000   -0.0001
##    420        0.0299             nan     0.1000   -0.0001
##    440        0.0269             nan     0.1000   -0.0000
##    460        0.0241             nan     0.1000   -0.0001
##    480        0.0215             nan     0.1000   -0.0000
##    500        0.0192             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2318             nan     0.1000    0.0412
##      2        1.1610             nan     0.1000    0.0307
##      3        1.0957             nan     0.1000    0.0271
##      4        1.0462             nan     0.1000    0.0202
##      5        0.9971             nan     0.1000    0.0196
##      6        0.9606             nan     0.1000    0.0137
##      7        0.9233             nan     0.1000    0.0135
##      8        0.8942             nan     0.1000    0.0111
##      9        0.8654             nan     0.1000    0.0119
##     10        0.8413             nan     0.1000    0.0083
##     20        0.6708             nan     0.1000    0.0020
##     40        0.5172             nan     0.1000   -0.0002
##     60        0.4258             nan     0.1000   -0.0000
##     80        0.3509             nan     0.1000   -0.0003
##    100        0.2907             nan     0.1000   -0.0002
##    120        0.2433             nan     0.1000   -0.0013
##    140        0.2096             nan     0.1000   -0.0006
##    160        0.1790             nan     0.1000   -0.0012
##    180        0.1568             nan     0.1000   -0.0005
##    200        0.1376             nan     0.1000   -0.0001
##    220        0.1209             nan     0.1000   -0.0004
##    240        0.1059             nan     0.1000   -0.0004
##    260        0.0917             nan     0.1000   -0.0002
##    280        0.0799             nan     0.1000   -0.0001
##    300        0.0705             nan     0.1000   -0.0003
##    320        0.0613             nan     0.1000   -0.0003
##    340        0.0538             nan     0.1000   -0.0001
##    360        0.0478             nan     0.1000   -0.0001
##    380        0.0425             nan     0.1000   -0.0002
##    400        0.0379             nan     0.1000   -0.0001
##    420        0.0331             nan     0.1000   -0.0001
##    440        0.0292             nan     0.1000   -0.0001
##    460        0.0266             nan     0.1000   -0.0001
##    480        0.0235             nan     0.1000   -0.0001
##    500        0.0209             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2365             nan     0.1000    0.0381
##      2        1.1645             nan     0.1000    0.0325
##      3        1.1032             nan     0.1000    0.0220
##      4        1.0445             nan     0.1000    0.0248
##      5        1.0016             nan     0.1000    0.0186
##      6        0.9501             nan     0.1000    0.0209
##      7        0.9195             nan     0.1000    0.0121
##      8        0.8871             nan     0.1000    0.0109
##      9        0.8610             nan     0.1000    0.0103
##     10        0.8342             nan     0.1000    0.0102
##     20        0.6795             nan     0.1000    0.0027
##     40        0.5304             nan     0.1000   -0.0004
##     60        0.4416             nan     0.1000   -0.0012
##     80        0.3697             nan     0.1000   -0.0010
##    100        0.3057             nan     0.1000   -0.0005
##    120        0.2625             nan     0.1000   -0.0008
##    140        0.2282             nan     0.1000   -0.0007
##    160        0.1951             nan     0.1000   -0.0003
##    180        0.1697             nan     0.1000   -0.0004
##    200        0.1495             nan     0.1000   -0.0003
##    220        0.1312             nan     0.1000   -0.0007
##    240        0.1148             nan     0.1000   -0.0006
##    260        0.0999             nan     0.1000   -0.0006
##    280        0.0883             nan     0.1000   -0.0004
##    300        0.0789             nan     0.1000   -0.0003
##    320        0.0693             nan     0.1000   -0.0005
##    340        0.0612             nan     0.1000   -0.0002
##    360        0.0541             nan     0.1000   -0.0001
##    380        0.0484             nan     0.1000   -0.0003
##    400        0.0433             nan     0.1000   -0.0002
##    420        0.0382             nan     0.1000   -0.0001
##    440        0.0337             nan     0.1000   -0.0000
##    460        0.0304             nan     0.1000   -0.0001
##    480        0.0272             nan     0.1000   -0.0002
##    500        0.0241             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3189             nan     0.0010    0.0004
##      3        1.3180             nan     0.0010    0.0004
##      4        1.3172             nan     0.0010    0.0004
##      5        1.3162             nan     0.0010    0.0005
##      6        1.3154             nan     0.0010    0.0004
##      7        1.3145             nan     0.0010    0.0004
##      8        1.3136             nan     0.0010    0.0004
##      9        1.3127             nan     0.0010    0.0004
##     10        1.3118             nan     0.0010    0.0004
##     20        1.3034             nan     0.0010    0.0004
##     40        1.2868             nan     0.0010    0.0004
##     60        1.2707             nan     0.0010    0.0004
##     80        1.2552             nan     0.0010    0.0004
##    100        1.2396             nan     0.0010    0.0003
##    120        1.2251             nan     0.0010    0.0003
##    140        1.2113             nan     0.0010    0.0003
##    160        1.1977             nan     0.0010    0.0003
##    180        1.1847             nan     0.0010    0.0003
##    200        1.1719             nan     0.0010    0.0003
##    220        1.1595             nan     0.0010    0.0002
##    240        1.1476             nan     0.0010    0.0003
##    260        1.1360             nan     0.0010    0.0003
##    280        1.1243             nan     0.0010    0.0003
##    300        1.1133             nan     0.0010    0.0002
##    320        1.1026             nan     0.0010    0.0002
##    340        1.0921             nan     0.0010    0.0002
##    360        1.0818             nan     0.0010    0.0002
##    380        1.0721             nan     0.0010    0.0002
##    400        1.0625             nan     0.0010    0.0002
##    420        1.0531             nan     0.0010    0.0002
##    440        1.0439             nan     0.0010    0.0002
##    460        1.0350             nan     0.0010    0.0002
##    480        1.0265             nan     0.0010    0.0002
##    500        1.0183             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3199             nan     0.0010    0.0003
##      2        1.3190             nan     0.0010    0.0004
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3174             nan     0.0010    0.0004
##      5        1.3166             nan     0.0010    0.0003
##      6        1.3157             nan     0.0010    0.0004
##      7        1.3150             nan     0.0010    0.0003
##      8        1.3142             nan     0.0010    0.0004
##      9        1.3132             nan     0.0010    0.0004
##     10        1.3124             nan     0.0010    0.0004
##     20        1.3037             nan     0.0010    0.0004
##     40        1.2872             nan     0.0010    0.0003
##     60        1.2712             nan     0.0010    0.0003
##     80        1.2554             nan     0.0010    0.0004
##    100        1.2404             nan     0.0010    0.0003
##    120        1.2262             nan     0.0010    0.0003
##    140        1.2123             nan     0.0010    0.0003
##    160        1.1989             nan     0.0010    0.0003
##    180        1.1856             nan     0.0010    0.0003
##    200        1.1727             nan     0.0010    0.0003
##    220        1.1602             nan     0.0010    0.0003
##    240        1.1480             nan     0.0010    0.0003
##    260        1.1363             nan     0.0010    0.0003
##    280        1.1247             nan     0.0010    0.0003
##    300        1.1137             nan     0.0010    0.0002
##    320        1.1028             nan     0.0010    0.0003
##    340        1.0926             nan     0.0010    0.0002
##    360        1.0824             nan     0.0010    0.0002
##    380        1.0726             nan     0.0010    0.0002
##    400        1.0628             nan     0.0010    0.0002
##    420        1.0534             nan     0.0010    0.0002
##    440        1.0443             nan     0.0010    0.0002
##    460        1.0353             nan     0.0010    0.0002
##    480        1.0269             nan     0.0010    0.0002
##    500        1.0183             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3190             nan     0.0010    0.0004
##      3        1.3181             nan     0.0010    0.0004
##      4        1.3172             nan     0.0010    0.0004
##      5        1.3164             nan     0.0010    0.0004
##      6        1.3155             nan     0.0010    0.0004
##      7        1.3146             nan     0.0010    0.0004
##      8        1.3137             nan     0.0010    0.0004
##      9        1.3130             nan     0.0010    0.0003
##     10        1.3122             nan     0.0010    0.0004
##     20        1.3039             nan     0.0010    0.0004
##     40        1.2877             nan     0.0010    0.0004
##     60        1.2721             nan     0.0010    0.0003
##     80        1.2570             nan     0.0010    0.0003
##    100        1.2422             nan     0.0010    0.0003
##    120        1.2279             nan     0.0010    0.0003
##    140        1.2138             nan     0.0010    0.0003
##    160        1.2007             nan     0.0010    0.0003
##    180        1.1876             nan     0.0010    0.0003
##    200        1.1749             nan     0.0010    0.0002
##    220        1.1625             nan     0.0010    0.0002
##    240        1.1508             nan     0.0010    0.0003
##    260        1.1394             nan     0.0010    0.0003
##    280        1.1283             nan     0.0010    0.0002
##    300        1.1176             nan     0.0010    0.0002
##    320        1.1067             nan     0.0010    0.0002
##    340        1.0965             nan     0.0010    0.0002
##    360        1.0863             nan     0.0010    0.0002
##    380        1.0766             nan     0.0010    0.0002
##    400        1.0669             nan     0.0010    0.0002
##    420        1.0576             nan     0.0010    0.0002
##    440        1.0486             nan     0.0010    0.0002
##    460        1.0398             nan     0.0010    0.0002
##    480        1.0311             nan     0.0010    0.0002
##    500        1.0228             nan     0.0010    0.0002
## 
## (verbose boosting output for six additional resamples at step size 0.0010 omitted; each shows the same monotone decline in training deviance from ~1.32 to ~1.00 over 500 iterations)
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3113             nan     0.0100    0.0041
##      2        1.3025             nan     0.0100    0.0040
##      3        1.2933             nan     0.0100    0.0040
##      4        1.2845             nan     0.0100    0.0039
##      5        1.2767             nan     0.0100    0.0038
##      6        1.2686             nan     0.0100    0.0041
##      7        1.2606             nan     0.0100    0.0036
##      8        1.2530             nan     0.0100    0.0033
##      9        1.2450             nan     0.0100    0.0037
##     10        1.2376             nan     0.0100    0.0034
##     20        1.1697             nan     0.0100    0.0032
##     40        1.0623             nan     0.0100    0.0018
##     60        0.9793             nan     0.0100    0.0017
##     80        0.9151             nan     0.0100    0.0012
##    100        0.8631             nan     0.0100    0.0008
##    120        0.8212             nan     0.0100    0.0007
##    140        0.7862             nan     0.0100    0.0008
##    160        0.7559             nan     0.0100    0.0005
##    180        0.7292             nan     0.0100    0.0004
##    200        0.7067             nan     0.0100    0.0002
##    220        0.6865             nan     0.0100    0.0002
##    240        0.6690             nan     0.0100    0.0002
##    260        0.6523             nan     0.0100    0.0001
##    280        0.6387             nan     0.0100    0.0001
##    300        0.6249             nan     0.0100    0.0002
##    320        0.6120             nan     0.0100    0.0000
##    340        0.5996             nan     0.0100    0.0001
##    360        0.5880             nan     0.0100    0.0001
##    380        0.5765             nan     0.0100    0.0001
##    400        0.5665             nan     0.0100   -0.0001
##    420        0.5572             nan     0.0100   -0.0001
##    440        0.5484             nan     0.0100    0.0001
##    460        0.5386             nan     0.0100    0.0000
##    480        0.5296             nan     0.0100   -0.0001
##    500        0.5213             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3125             nan     0.0100    0.0037
##      2        1.3037             nan     0.0100    0.0043
##      3        1.2956             nan     0.0100    0.0035
##      4        1.2872             nan     0.0100    0.0038
##      5        1.2791             nan     0.0100    0.0032
##      6        1.2709             nan     0.0100    0.0039
##      7        1.2627             nan     0.0100    0.0039
##      8        1.2547             nan     0.0100    0.0035
##      9        1.2470             nan     0.0100    0.0038
##     10        1.2390             nan     0.0100    0.0036
##     20        1.1732             nan     0.0100    0.0029
##     40        1.0646             nan     0.0100    0.0020
##     60        0.9808             nan     0.0100    0.0014
##     80        0.9160             nan     0.0100    0.0013
##    100        0.8636             nan     0.0100    0.0010
##    120        0.8213             nan     0.0100    0.0006
##    140        0.7858             nan     0.0100    0.0006
##    160        0.7551             nan     0.0100    0.0003
##    180        0.7302             nan     0.0100    0.0003
##    200        0.7077             nan     0.0100    0.0002
##    220        0.6870             nan     0.0100    0.0002
##    240        0.6699             nan     0.0100    0.0000
##    260        0.6544             nan     0.0100    0.0001
##    280        0.6402             nan     0.0100    0.0001
##    300        0.6277             nan     0.0100    0.0000
##    320        0.6156             nan     0.0100   -0.0000
##    340        0.6039             nan     0.0100   -0.0001
##    360        0.5921             nan     0.0100    0.0000
##    380        0.5819             nan     0.0100   -0.0000
##    400        0.5713             nan     0.0100    0.0001
##    420        0.5621             nan     0.0100   -0.0001
##    440        0.5524             nan     0.0100   -0.0000
##    460        0.5430             nan     0.0100   -0.0000
##    480        0.5345             nan     0.0100   -0.0000
##    500        0.5261             nan     0.0100   -0.0001
## 
## (verbose boosting output for six additional resamples at step size 0.0100 omitted; each shows the same monotone decline in training deviance from ~1.31 to ~0.43-0.54 over 500 iterations)
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3113             nan     0.0100    0.0045
##      2        1.3015             nan     0.0100    0.0044
##      3        1.2925             nan     0.0100    0.0039
##      4        1.2837             nan     0.0100    0.0038
##      5        1.2748             nan     0.0100    0.0041
##      6        1.2662             nan     0.0100    0.0037
##      7        1.2577             nan     0.0100    0.0037
##      8        1.2495             nan     0.0100    0.0034
##      9        1.2406             nan     0.0100    0.0041
##     10        1.2322             nan     0.0100    0.0040
##     20        1.1593             nan     0.0100    0.0029
##     40        1.0397             nan     0.0100    0.0023
##     60        0.9512             nan     0.0100    0.0016
##     80        0.8820             nan     0.0100    0.0011
##    100        0.8264             nan     0.0100    0.0009
##    120        0.7809             nan     0.0100    0.0003
##    140        0.7425             nan     0.0100    0.0004
##    160        0.7099             nan     0.0100    0.0006
##    180        0.6828             nan     0.0100    0.0004
##    200        0.6597             nan     0.0100    0.0003
##    220        0.6382             nan     0.0100    0.0003
##    240        0.6182             nan     0.0100   -0.0000
##    260        0.5991             nan     0.0100    0.0003
##    280        0.5824             nan     0.0100   -0.0000
##    300        0.5667             nan     0.0100    0.0001
##    320        0.5520             nan     0.0100   -0.0000
##    340        0.5389             nan     0.0100    0.0001
##    360        0.5258             nan     0.0100   -0.0000
##    380        0.5128             nan     0.0100    0.0001
##    400        0.5015             nan     0.0100    0.0002
##    420        0.4906             nan     0.0100   -0.0001
##    440        0.4803             nan     0.0100    0.0001
##    460        0.4696             nan     0.0100   -0.0000
##    480        0.4597             nan     0.0100   -0.0000
##    500        0.4494             nan     0.0100   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2384             nan     0.1000    0.0431
##      2        1.1693             nan     0.1000    0.0291
##      3        1.1079             nan     0.1000    0.0285
##      4        1.0560             nan     0.1000    0.0217
##      5        1.0152             nan     0.1000    0.0170
##      6        0.9729             nan     0.1000    0.0166
##      7        0.9383             nan     0.1000    0.0139
##      8        0.9130             nan     0.1000    0.0087
##      9        0.8868             nan     0.1000    0.0110
##     10        0.8621             nan     0.1000    0.0093
##     20        0.7071             nan     0.1000    0.0022
##     40        0.5644             nan     0.1000   -0.0002
##     60        0.4820             nan     0.1000   -0.0009
##     80        0.4195             nan     0.1000   -0.0007
##    100        0.3751             nan     0.1000   -0.0011
##    120        0.3285             nan     0.1000   -0.0007
##    140        0.2930             nan     0.1000   -0.0002
##    160        0.2609             nan     0.1000   -0.0003
##    180        0.2329             nan     0.1000    0.0002
##    200        0.2109             nan     0.1000   -0.0008
##    220        0.1884             nan     0.1000   -0.0002
##    240        0.1725             nan     0.1000   -0.0001
##    260        0.1547             nan     0.1000   -0.0004
##    280        0.1406             nan     0.1000   -0.0002
##    300        0.1294             nan     0.1000   -0.0003
##    320        0.1181             nan     0.1000   -0.0004
##    340        0.1086             nan     0.1000   -0.0002
##    360        0.0996             nan     0.1000   -0.0005
##    380        0.0909             nan     0.1000   -0.0001
##    400        0.0833             nan     0.1000   -0.0005
##    420        0.0764             nan     0.1000   -0.0001
##    440        0.0696             nan     0.1000   -0.0001
##    460        0.0639             nan     0.1000   -0.0002
##    480        0.0587             nan     0.1000   -0.0002
##    500        0.0544             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2333             nan     0.1000    0.0358
##      2        1.1624             nan     0.1000    0.0333
##      3        1.1126             nan     0.1000    0.0234
##      4        1.0607             nan     0.1000    0.0211
##      5        1.0143             nan     0.1000    0.0187
##      6        0.9817             nan     0.1000    0.0141
##      7        0.9481             nan     0.1000    0.0151
##      8        0.9182             nan     0.1000    0.0105
##      9        0.8900             nan     0.1000    0.0109
##     10        0.8650             nan     0.1000    0.0093
##     20        0.7143             nan     0.1000    0.0023
##     40        0.5757             nan     0.1000    0.0006
##     60        0.4900             nan     0.1000   -0.0007
##     80        0.4296             nan     0.1000   -0.0011
##    100        0.3731             nan     0.1000   -0.0003
##    120        0.3336             nan     0.1000   -0.0012
##    140        0.2985             nan     0.1000   -0.0007
##    160        0.2714             nan     0.1000   -0.0006
##    180        0.2404             nan     0.1000   -0.0007
##    200        0.2179             nan     0.1000   -0.0009
##    220        0.1975             nan     0.1000   -0.0004
##    240        0.1801             nan     0.1000   -0.0003
##    260        0.1646             nan     0.1000   -0.0008
##    280        0.1505             nan     0.1000   -0.0004
##    300        0.1388             nan     0.1000   -0.0003
##    320        0.1265             nan     0.1000   -0.0005
##    340        0.1155             nan     0.1000   -0.0003
##    360        0.1065             nan     0.1000   -0.0001
##    380        0.0977             nan     0.1000    0.0001
##    400        0.0890             nan     0.1000   -0.0003
##    420        0.0812             nan     0.1000    0.0000
##    440        0.0745             nan     0.1000   -0.0002
##    460        0.0690             nan     0.1000   -0.0000
##    480        0.0633             nan     0.1000   -0.0002
##    500        0.0589             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2332             nan     0.1000    0.0364
##      2        1.1662             nan     0.1000    0.0274
##      3        1.1088             nan     0.1000    0.0243
##      4        1.0578             nan     0.1000    0.0252
##      5        1.0166             nan     0.1000    0.0174
##      6        0.9795             nan     0.1000    0.0148
##      7        0.9466             nan     0.1000    0.0149
##      8        0.9170             nan     0.1000    0.0116
##      9        0.8878             nan     0.1000    0.0113
##     10        0.8660             nan     0.1000    0.0071
##     20        0.7104             nan     0.1000    0.0038
##     40        0.5861             nan     0.1000   -0.0001
##     60        0.5076             nan     0.1000   -0.0020
##     80        0.4489             nan     0.1000   -0.0010
##    100        0.3963             nan     0.1000   -0.0013
##    120        0.3562             nan     0.1000   -0.0005
##    140        0.3132             nan     0.1000   -0.0007
##    160        0.2816             nan     0.1000   -0.0001
##    180        0.2546             nan     0.1000   -0.0008
##    200        0.2304             nan     0.1000   -0.0011
##    220        0.2077             nan     0.1000   -0.0005
##    240        0.1923             nan     0.1000   -0.0008
##    260        0.1766             nan     0.1000   -0.0005
##    280        0.1644             nan     0.1000   -0.0009
##    300        0.1503             nan     0.1000   -0.0007
##    320        0.1386             nan     0.1000   -0.0005
##    340        0.1275             nan     0.1000   -0.0004
##    360        0.1160             nan     0.1000   -0.0003
##    380        0.1081             nan     0.1000   -0.0003
##    400        0.0998             nan     0.1000   -0.0003
##    420        0.0924             nan     0.1000   -0.0003
##    440        0.0857             nan     0.1000   -0.0002
##    460        0.0789             nan     0.1000   -0.0001
##    480        0.0732             nan     0.1000   -0.0002
##    500        0.0676             nan     0.1000   -0.0004
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2305             nan     0.1000    0.0379
##      2        1.1606             nan     0.1000    0.0328
##      3        1.1010             nan     0.1000    0.0268
##      4        1.0459             nan     0.1000    0.0245
##      5        0.9947             nan     0.1000    0.0220
##      6        0.9569             nan     0.1000    0.0120
##      7        0.9196             nan     0.1000    0.0140
##      8        0.8895             nan     0.1000    0.0131
##      9        0.8597             nan     0.1000    0.0118
##     10        0.8343             nan     0.1000    0.0096
##     20        0.6732             nan     0.1000    0.0057
##     40        0.5263             nan     0.1000   -0.0015
##     60        0.4333             nan     0.1000    0.0011
##     80        0.3679             nan     0.1000    0.0002
##    100        0.3060             nan     0.1000   -0.0013
##    120        0.2614             nan     0.1000   -0.0001
##    140        0.2277             nan     0.1000   -0.0009
##    160        0.1969             nan     0.1000   -0.0007
##    180        0.1717             nan     0.1000   -0.0012
##    200        0.1511             nan     0.1000   -0.0003
##    220        0.1350             nan     0.1000   -0.0001
##    240        0.1209             nan     0.1000   -0.0003
##    260        0.1087             nan     0.1000   -0.0003
##    280        0.0968             nan     0.1000   -0.0004
##    300        0.0863             nan     0.1000    0.0001
##    320        0.0765             nan     0.1000   -0.0000
##    340        0.0694             nan     0.1000   -0.0003
##    360        0.0611             nan     0.1000    0.0000
##    380        0.0549             nan     0.1000   -0.0002
##    400        0.0492             nan     0.1000   -0.0002
##    420        0.0444             nan     0.1000   -0.0001
##    440        0.0402             nan     0.1000   -0.0002
##    460        0.0362             nan     0.1000   -0.0000
##    480        0.0326             nan     0.1000   -0.0000
##    500        0.0295             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2335             nan     0.1000    0.0402
##      2        1.1649             nan     0.1000    0.0314
##      3        1.0973             nan     0.1000    0.0310
##      4        1.0445             nan     0.1000    0.0232
##      5        0.9977             nan     0.1000    0.0190
##      6        0.9551             nan     0.1000    0.0182
##      7        0.9189             nan     0.1000    0.0118
##      8        0.8910             nan     0.1000    0.0090
##      9        0.8642             nan     0.1000    0.0111
##     10        0.8401             nan     0.1000    0.0089
##     20        0.6810             nan     0.1000    0.0015
##     40        0.5421             nan     0.1000   -0.0001
##     60        0.4520             nan     0.1000   -0.0003
##     80        0.3825             nan     0.1000   -0.0018
##    100        0.3270             nan     0.1000   -0.0006
##    120        0.2832             nan     0.1000   -0.0004
##    140        0.2428             nan     0.1000   -0.0007
##    160        0.2131             nan     0.1000   -0.0003
##    180        0.1859             nan     0.1000   -0.0007
##    200        0.1638             nan     0.1000   -0.0006
##    220        0.1429             nan     0.1000   -0.0007
##    240        0.1286             nan     0.1000   -0.0006
##    260        0.1132             nan     0.1000   -0.0000
##    280        0.0986             nan     0.1000   -0.0003
##    300        0.0888             nan     0.1000   -0.0002
##    320        0.0798             nan     0.1000   -0.0001
##    340        0.0712             nan     0.1000   -0.0003
##    360        0.0637             nan     0.1000   -0.0000
##    380        0.0579             nan     0.1000   -0.0002
##    400        0.0520             nan     0.1000   -0.0002
##    420        0.0470             nan     0.1000   -0.0002
##    440        0.0421             nan     0.1000   -0.0000
##    460        0.0377             nan     0.1000   -0.0001
##    480        0.0341             nan     0.1000   -0.0001
##    500        0.0307             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2342             nan     0.1000    0.0396
##      2        1.1614             nan     0.1000    0.0327
##      3        1.0967             nan     0.1000    0.0287
##      4        1.0451             nan     0.1000    0.0228
##      5        1.0012             nan     0.1000    0.0171
##      6        0.9591             nan     0.1000    0.0170
##      7        0.9218             nan     0.1000    0.0145
##      8        0.8938             nan     0.1000    0.0117
##      9        0.8646             nan     0.1000    0.0106
##     10        0.8433             nan     0.1000    0.0080
##     20        0.6923             nan     0.1000    0.0019
##     40        0.5484             nan     0.1000    0.0001
##     60        0.4579             nan     0.1000   -0.0005
##     80        0.3872             nan     0.1000   -0.0007
##    100        0.3368             nan     0.1000   -0.0004
##    120        0.2916             nan     0.1000    0.0001
##    140        0.2577             nan     0.1000   -0.0004
##    160        0.2281             nan     0.1000   -0.0006
##    180        0.1987             nan     0.1000   -0.0009
##    200        0.1746             nan     0.1000   -0.0006
##    220        0.1562             nan     0.1000   -0.0003
##    240        0.1404             nan     0.1000   -0.0006
##    260        0.1261             nan     0.1000   -0.0002
##    280        0.1114             nan     0.1000   -0.0004
##    300        0.1003             nan     0.1000   -0.0005
##    320        0.0907             nan     0.1000   -0.0005
##    340        0.0813             nan     0.1000   -0.0003
##    360        0.0726             nan     0.1000   -0.0000
##    380        0.0651             nan     0.1000   -0.0000
##    400        0.0587             nan     0.1000   -0.0004
##    420        0.0531             nan     0.1000   -0.0002
##    440        0.0478             nan     0.1000   -0.0001
##    460        0.0431             nan     0.1000   -0.0002
##    480        0.0395             nan     0.1000   -0.0003
##    500        0.0359             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2256             nan     0.1000    0.0448
##      2        1.1526             nan     0.1000    0.0362
##      3        1.0851             nan     0.1000    0.0291
##      4        1.0251             nan     0.1000    0.0263
##      5        0.9742             nan     0.1000    0.0198
##      6        0.9347             nan     0.1000    0.0174
##      7        0.8966             nan     0.1000    0.0163
##      8        0.8654             nan     0.1000    0.0133
##      9        0.8368             nan     0.1000    0.0103
##     10        0.8085             nan     0.1000    0.0093
##     20        0.6385             nan     0.1000    0.0035
##     40        0.4912             nan     0.1000    0.0003
##     60        0.3941             nan     0.1000   -0.0011
##     80        0.3160             nan     0.1000    0.0004
##    100        0.2621             nan     0.1000    0.0001
##    120        0.2186             nan     0.1000   -0.0005
##    140        0.1877             nan     0.1000   -0.0004
##    160        0.1568             nan     0.1000   -0.0004
##    180        0.1366             nan     0.1000   -0.0004
##    200        0.1178             nan     0.1000   -0.0004
##    220        0.1000             nan     0.1000   -0.0001
##    240        0.0871             nan     0.1000   -0.0002
##    260        0.0771             nan     0.1000   -0.0004
##    280        0.0682             nan     0.1000   -0.0003
##    300        0.0597             nan     0.1000    0.0000
##    320        0.0518             nan     0.1000   -0.0000
##    340        0.0453             nan     0.1000   -0.0001
##    360        0.0392             nan     0.1000   -0.0001
##    380        0.0345             nan     0.1000    0.0001
##    400        0.0302             nan     0.1000   -0.0000
##    420        0.0266             nan     0.1000   -0.0000
##    440        0.0233             nan     0.1000   -0.0001
##    460        0.0205             nan     0.1000   -0.0000
##    480        0.0179             nan     0.1000   -0.0001
##    500        0.0159             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2242             nan     0.1000    0.0420
##      2        1.1512             nan     0.1000    0.0342
##      3        1.0894             nan     0.1000    0.0290
##      4        1.0329             nan     0.1000    0.0238
##      5        0.9820             nan     0.1000    0.0205
##      6        0.9435             nan     0.1000    0.0146
##      7        0.9106             nan     0.1000    0.0125
##      8        0.8719             nan     0.1000    0.0148
##      9        0.8426             nan     0.1000    0.0131
##     10        0.8164             nan     0.1000    0.0092
##     20        0.6527             nan     0.1000    0.0023
##     40        0.4884             nan     0.1000   -0.0002
##     60        0.4010             nan     0.1000    0.0013
##     80        0.3218             nan     0.1000   -0.0002
##    100        0.2652             nan     0.1000   -0.0010
##    120        0.2301             nan     0.1000   -0.0010
##    140        0.1940             nan     0.1000   -0.0005
##    160        0.1695             nan     0.1000   -0.0006
##    180        0.1452             nan     0.1000   -0.0006
##    200        0.1241             nan     0.1000   -0.0002
##    220        0.1071             nan     0.1000   -0.0006
##    240        0.0943             nan     0.1000    0.0001
##    260        0.0827             nan     0.1000   -0.0005
##    280        0.0729             nan     0.1000   -0.0006
##    300        0.0635             nan     0.1000   -0.0003
##    320        0.0553             nan     0.1000   -0.0002
##    340        0.0489             nan     0.1000   -0.0001
##    360        0.0429             nan     0.1000    0.0000
##    380        0.0374             nan     0.1000   -0.0001
##    400        0.0325             nan     0.1000   -0.0001
##    420        0.0288             nan     0.1000   -0.0001
##    440        0.0254             nan     0.1000   -0.0000
##    460        0.0225             nan     0.1000   -0.0001
##    480        0.0199             nan     0.1000   -0.0001
##    500        0.0178             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2377             nan     0.1000    0.0423
##      2        1.1583             nan     0.1000    0.0353
##      3        1.0942             nan     0.1000    0.0261
##      4        1.0363             nan     0.1000    0.0253
##      5        0.9896             nan     0.1000    0.0192
##      6        0.9486             nan     0.1000    0.0170
##      7        0.9115             nan     0.1000    0.0150
##      8        0.8810             nan     0.1000    0.0126
##      9        0.8504             nan     0.1000    0.0116
##     10        0.8206             nan     0.1000    0.0121
##     20        0.6615             nan     0.1000    0.0024
##     40        0.5043             nan     0.1000   -0.0009
##     60        0.4036             nan     0.1000   -0.0005
##     80        0.3387             nan     0.1000   -0.0008
##    100        0.2872             nan     0.1000   -0.0013
##    120        0.2439             nan     0.1000    0.0003
##    140        0.2083             nan     0.1000   -0.0005
##    160        0.1770             nan     0.1000   -0.0005
##    180        0.1524             nan     0.1000   -0.0005
##    200        0.1315             nan     0.1000   -0.0004
##    220        0.1153             nan     0.1000   -0.0004
##    240        0.1014             nan     0.1000   -0.0004
##    260        0.0891             nan     0.1000   -0.0004
##    280        0.0792             nan     0.1000   -0.0001
##    300        0.0693             nan     0.1000   -0.0005
##    320        0.0602             nan     0.1000   -0.0001
##    340        0.0534             nan     0.1000   -0.0003
##    360        0.0472             nan     0.1000   -0.0002
##    380        0.0418             nan     0.1000   -0.0001
##    400        0.0370             nan     0.1000   -0.0001
##    420        0.0333             nan     0.1000   -0.0002
##    440        0.0294             nan     0.1000   -0.0001
##    460        0.0260             nan     0.1000   -0.0000
##    480        0.0229             nan     0.1000   -0.0001
##    500        0.0206             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3195             nan     0.0010    0.0003
##      3        1.3186             nan     0.0010    0.0004
##      4        1.3177             nan     0.0010    0.0004
##      5        1.3168             nan     0.0010    0.0004
##      6        1.3159             nan     0.0010    0.0004
##      7        1.3151             nan     0.0010    0.0004
##      8        1.3142             nan     0.0010    0.0003
##      9        1.3134             nan     0.0010    0.0004
##     10        1.3126             nan     0.0010    0.0003
##     20        1.3044             nan     0.0010    0.0003
##     40        1.2881             nan     0.0010    0.0003
##     60        1.2721             nan     0.0010    0.0003
##     80        1.2566             nan     0.0010    0.0003
##    100        1.2417             nan     0.0010    0.0003
##    120        1.2275             nan     0.0010    0.0003
##    140        1.2137             nan     0.0010    0.0003
##    160        1.2002             nan     0.0010    0.0003
##    180        1.1876             nan     0.0010    0.0003
##    200        1.1752             nan     0.0010    0.0002
##    220        1.1629             nan     0.0010    0.0003
##    240        1.1507             nan     0.0010    0.0003
##    260        1.1392             nan     0.0010    0.0003
##    280        1.1278             nan     0.0010    0.0002
##    300        1.1172             nan     0.0010    0.0002
##    320        1.1065             nan     0.0010    0.0002
##    340        1.0960             nan     0.0010    0.0002
##    360        1.0860             nan     0.0010    0.0002
##    380        1.0762             nan     0.0010    0.0002
##    400        1.0667             nan     0.0010    0.0002
##    420        1.0573             nan     0.0010    0.0002
##    440        1.0482             nan     0.0010    0.0002
##    460        1.0394             nan     0.0010    0.0002
##    480        1.0310             nan     0.0010    0.0001
##    500        1.0228             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3194             nan     0.0010    0.0004
##      3        1.3185             nan     0.0010    0.0004
##      4        1.3177             nan     0.0010    0.0004
##      5        1.3168             nan     0.0010    0.0003
##      6        1.3159             nan     0.0010    0.0004
##      7        1.3150             nan     0.0010    0.0004
##      8        1.3141             nan     0.0010    0.0004
##      9        1.3132             nan     0.0010    0.0004
##     10        1.3124             nan     0.0010    0.0003
##     20        1.3041             nan     0.0010    0.0004
##     40        1.2877             nan     0.0010    0.0003
##     60        1.2719             nan     0.0010    0.0004
##     80        1.2566             nan     0.0010    0.0004
##    100        1.2416             nan     0.0010    0.0003
##    120        1.2276             nan     0.0010    0.0003
##    140        1.2137             nan     0.0010    0.0003
##    160        1.2003             nan     0.0010    0.0003
##    180        1.1872             nan     0.0010    0.0003
##    200        1.1743             nan     0.0010    0.0003
##    220        1.1619             nan     0.0010    0.0003
##    240        1.1498             nan     0.0010    0.0003
##    260        1.1386             nan     0.0010    0.0003
##    280        1.1274             nan     0.0010    0.0003
##    300        1.1164             nan     0.0010    0.0003
##    320        1.1061             nan     0.0010    0.0002
##    340        1.0960             nan     0.0010    0.0002
##    360        1.0858             nan     0.0010    0.0003
##    380        1.0759             nan     0.0010    0.0002
##    400        1.0663             nan     0.0010    0.0002
##    420        1.0571             nan     0.0010    0.0002
##    440        1.0481             nan     0.0010    0.0002
##    460        1.0393             nan     0.0010    0.0002
##    480        1.0307             nan     0.0010    0.0002
##    500        1.0221             nan     0.0010    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3204             nan     0.0010    0.0004
##      2        1.3196             nan     0.0010    0.0003
##      3        1.3187             nan     0.0010    0.0004
##      4        1.3178             nan     0.0010    0.0004
##      5        1.3170             nan     0.0010    0.0004
##      6        1.3161             nan     0.0010    0.0004
##      7        1.3154             nan     0.0010    0.0003
##      8        1.3145             nan     0.0010    0.0004
##      9        1.3136             nan     0.0010    0.0004
##     10        1.3128             nan     0.0010    0.0004
##     20        1.3044             nan     0.0010    0.0004
##     40        1.2885             nan     0.0010    0.0003
##     60        1.2727             nan     0.0010    0.0003
##     80        1.2576             nan     0.0010    0.0003
##    100        1.2428             nan     0.0010    0.0003
##    120        1.2288             nan     0.0010    0.0003
##    140        1.2149             nan     0.0010    0.0003
##    160        1.2015             nan     0.0010    0.0003
##    180        1.1883             nan     0.0010    0.0003
##    200        1.1761             nan     0.0010    0.0003
##    220        1.1638             nan     0.0010    0.0003
##    240        1.1521             nan     0.0010    0.0003
##    260        1.1405             nan     0.0010    0.0002
##    280        1.1294             nan     0.0010    0.0003
##    300        1.1187             nan     0.0010    0.0002
##    320        1.1081             nan     0.0010    0.0002
##    340        1.0979             nan     0.0010    0.0003
##    360        1.0881             nan     0.0010    0.0002
##    380        1.0783             nan     0.0010    0.0002
##    400        1.0690             nan     0.0010    0.0002
##    420        1.0598             nan     0.0010    0.0002
##    440        1.0509             nan     0.0010    0.0002
##    460        1.0420             nan     0.0010    0.0002
##    480        1.0335             nan     0.0010    0.0002
##    500        1.0253             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0005
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3183             nan     0.0010    0.0004
##      4        1.3175             nan     0.0010    0.0004
##      5        1.3165             nan     0.0010    0.0005
##      6        1.3155             nan     0.0010    0.0004
##      7        1.3145             nan     0.0010    0.0005
##      8        1.3136             nan     0.0010    0.0004
##      9        1.3127             nan     0.0010    0.0004
##     10        1.3117             nan     0.0010    0.0004
##     20        1.3028             nan     0.0010    0.0004
##     40        1.2852             nan     0.0010    0.0004
##     60        1.2682             nan     0.0010    0.0003
##   ... (verbose gbm boosting logs truncated: training deviance decreases
##   steadily over the 500 iterations in every resample, for each tested
##   shrinkage setting of 0.0010, 0.0100 and 0.1000; validation deviance is
##   reported as nan because no internal validation fraction was held out) ...
##    360        0.1078             nan     0.1000   -0.0002
##    380        0.0980             nan     0.1000   -0.0002
##    400        0.0902             nan     0.1000   -0.0003
##    420        0.0837             nan     0.1000   -0.0002
##    440        0.0771             nan     0.1000   -0.0004
##    460        0.0714             nan     0.1000   -0.0002
##    480        0.0658             nan     0.1000   -0.0001
##    500        0.0608             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2515             nan     0.1000    0.0322
##      2        1.1786             nan     0.1000    0.0322
##      3        1.1205             nan     0.1000    0.0236
##      4        1.0700             nan     0.1000    0.0204
##      5        1.0232             nan     0.1000    0.0195
##      6        0.9869             nan     0.1000    0.0160
##      7        0.9537             nan     0.1000    0.0137
##      8        0.9200             nan     0.1000    0.0125
##      9        0.8963             nan     0.1000    0.0114
##     10        0.8733             nan     0.1000    0.0080
##     20        0.7202             nan     0.1000    0.0021
##     40        0.5902             nan     0.1000   -0.0005
##     60        0.5128             nan     0.1000   -0.0012
##     80        0.4548             nan     0.1000    0.0004
##    100        0.3988             nan     0.1000    0.0001
##    120        0.3571             nan     0.1000    0.0002
##    140        0.3222             nan     0.1000    0.0003
##    160        0.2921             nan     0.1000   -0.0000
##    180        0.2610             nan     0.1000   -0.0008
##    200        0.2376             nan     0.1000   -0.0013
##    220        0.2164             nan     0.1000   -0.0010
##    240        0.1966             nan     0.1000   -0.0009
##    260        0.1782             nan     0.1000   -0.0004
##    280        0.1612             nan     0.1000   -0.0007
##    300        0.1488             nan     0.1000   -0.0001
##    320        0.1361             nan     0.1000   -0.0006
##    340        0.1256             nan     0.1000   -0.0007
##    360        0.1149             nan     0.1000   -0.0001
##    380        0.1053             nan     0.1000   -0.0003
##    400        0.0971             nan     0.1000   -0.0004
##    420        0.0892             nan     0.1000   -0.0004
##    440        0.0819             nan     0.1000   -0.0004
##    460        0.0766             nan     0.1000   -0.0004
##    480        0.0707             nan     0.1000   -0.0002
##    500        0.0654             nan     0.1000   -0.0003
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2283             nan     0.1000    0.0425
##      2        1.1602             nan     0.1000    0.0324
##      3        1.0988             nan     0.1000    0.0244
##      4        1.0405             nan     0.1000    0.0269
##      5        0.9900             nan     0.1000    0.0153
##      6        0.9509             nan     0.1000    0.0168
##      7        0.9192             nan     0.1000    0.0110
##      8        0.8927             nan     0.1000    0.0091
##      9        0.8634             nan     0.1000    0.0122
##     10        0.8385             nan     0.1000    0.0071
##     20        0.6704             nan     0.1000    0.0038
##     40        0.5280             nan     0.1000   -0.0016
##     60        0.4374             nan     0.1000    0.0000
##     80        0.3716             nan     0.1000   -0.0009
##    100        0.3165             nan     0.1000   -0.0004
##    120        0.2692             nan     0.1000   -0.0018
##    140        0.2315             nan     0.1000   -0.0001
##    160        0.1996             nan     0.1000   -0.0006
##    180        0.1756             nan     0.1000   -0.0003
##    200        0.1527             nan     0.1000   -0.0003
##    220        0.1356             nan     0.1000   -0.0003
##    240        0.1205             nan     0.1000   -0.0000
##    260        0.1084             nan     0.1000   -0.0002
##    280        0.0972             nan     0.1000   -0.0002
##    300        0.0874             nan     0.1000   -0.0001
##    320        0.0784             nan     0.1000   -0.0002
##    340        0.0702             nan     0.1000   -0.0001
##    360        0.0632             nan     0.1000   -0.0001
##    380        0.0566             nan     0.1000   -0.0001
##    400        0.0501             nan     0.1000   -0.0001
##    420        0.0451             nan     0.1000   -0.0001
##    440        0.0408             nan     0.1000   -0.0001
##    460        0.0373             nan     0.1000   -0.0001
##    480        0.0334             nan     0.1000   -0.0000
##    500        0.0304             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2249             nan     0.1000    0.0440
##      2        1.1548             nan     0.1000    0.0302
##      3        1.0927             nan     0.1000    0.0259
##      4        1.0366             nan     0.1000    0.0212
##      5        0.9925             nan     0.1000    0.0195
##      6        0.9550             nan     0.1000    0.0171
##      7        0.9190             nan     0.1000    0.0155
##      8        0.8867             nan     0.1000    0.0116
##      9        0.8576             nan     0.1000    0.0110
##     10        0.8326             nan     0.1000    0.0090
##     20        0.6773             nan     0.1000    0.0026
##     40        0.5369             nan     0.1000   -0.0012
##     60        0.4528             nan     0.1000    0.0000
##     80        0.3841             nan     0.1000   -0.0008
##    100        0.3242             nan     0.1000    0.0004
##    120        0.2822             nan     0.1000   -0.0017
##    140        0.2462             nan     0.1000   -0.0009
##    160        0.2162             nan     0.1000   -0.0005
##    180        0.1890             nan     0.1000   -0.0004
##    200        0.1653             nan     0.1000   -0.0001
##    220        0.1458             nan     0.1000   -0.0007
##    240        0.1283             nan     0.1000    0.0001
##    260        0.1149             nan     0.1000   -0.0004
##    280        0.1028             nan     0.1000   -0.0003
##    300        0.0933             nan     0.1000   -0.0005
##    320        0.0833             nan     0.1000   -0.0003
##    340        0.0751             nan     0.1000   -0.0002
##    360        0.0663             nan     0.1000   -0.0002
##    380        0.0592             nan     0.1000   -0.0002
##    400        0.0533             nan     0.1000   -0.0001
##    420        0.0478             nan     0.1000   -0.0001
##    440        0.0439             nan     0.1000   -0.0003
##    460        0.0391             nan     0.1000   -0.0002
##    480        0.0349             nan     0.1000   -0.0001
##    500        0.0315             nan     0.1000    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2348             nan     0.1000    0.0395
##      2        1.1588             nan     0.1000    0.0318
##      3        1.0987             nan     0.1000    0.0264
##      4        1.0490             nan     0.1000    0.0215
##      5        1.0089             nan     0.1000    0.0173
##      6        0.9698             nan     0.1000    0.0162
##      7        0.9311             nan     0.1000    0.0158
##      8        0.8981             nan     0.1000    0.0137
##      9        0.8698             nan     0.1000    0.0097
##     10        0.8453             nan     0.1000    0.0102
##     20        0.6898             nan     0.1000    0.0014
##     40        0.5395             nan     0.1000    0.0004
##     60        0.4526             nan     0.1000   -0.0000
##     80        0.3807             nan     0.1000   -0.0008
##    100        0.3286             nan     0.1000   -0.0007
##    120        0.2910             nan     0.1000   -0.0001
##    140        0.2515             nan     0.1000   -0.0006
##    160        0.2217             nan     0.1000   -0.0002
##    180        0.1971             nan     0.1000   -0.0009
##    200        0.1737             nan     0.1000   -0.0007
##    220        0.1535             nan     0.1000   -0.0007
##    240        0.1359             nan     0.1000   -0.0010
##    260        0.1219             nan     0.1000   -0.0006
##    280        0.1080             nan     0.1000   -0.0003
##    300        0.0981             nan     0.1000   -0.0005
##    320        0.0890             nan     0.1000   -0.0004
##    340        0.0806             nan     0.1000   -0.0002
##    360        0.0730             nan     0.1000   -0.0001
##    380        0.0660             nan     0.1000   -0.0003
##    400        0.0596             nan     0.1000   -0.0001
##    420        0.0538             nan     0.1000   -0.0001
##    440        0.0482             nan     0.1000   -0.0001
##    460        0.0432             nan     0.1000   -0.0001
##    480        0.0389             nan     0.1000   -0.0001
##    500        0.0350             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2234             nan     0.1000    0.0478
##      2        1.1554             nan     0.1000    0.0303
##      3        1.0933             nan     0.1000    0.0298
##      4        1.0376             nan     0.1000    0.0242
##      5        0.9835             nan     0.1000    0.0229
##      6        0.9388             nan     0.1000    0.0189
##      7        0.9036             nan     0.1000    0.0141
##      8        0.8730             nan     0.1000    0.0102
##      9        0.8409             nan     0.1000    0.0132
##     10        0.8129             nan     0.1000    0.0094
##     20        0.6465             nan     0.1000    0.0030
##     40        0.4794             nan     0.1000    0.0004
##     60        0.3904             nan     0.1000   -0.0010
##     80        0.3202             nan     0.1000   -0.0004
##    100        0.2656             nan     0.1000   -0.0003
##    120        0.2211             nan     0.1000   -0.0008
##    140        0.1881             nan     0.1000   -0.0004
##    160        0.1634             nan     0.1000   -0.0003
##    180        0.1420             nan     0.1000   -0.0005
##    200        0.1209             nan     0.1000   -0.0000
##    220        0.1031             nan     0.1000   -0.0001
##    240        0.0889             nan     0.1000   -0.0000
##    260        0.0766             nan     0.1000   -0.0002
##    280        0.0677             nan     0.1000   -0.0002
##    300        0.0592             nan     0.1000   -0.0004
##    320        0.0524             nan     0.1000   -0.0001
##    340        0.0452             nan     0.1000   -0.0003
##    360        0.0399             nan     0.1000   -0.0000
##    380        0.0358             nan     0.1000   -0.0001
##    400        0.0316             nan     0.1000   -0.0001
##    420        0.0282             nan     0.1000   -0.0000
##    440        0.0249             nan     0.1000   -0.0001
##    460        0.0224             nan     0.1000   -0.0000
##    480        0.0197             nan     0.1000   -0.0001
##    500        0.0174             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2292             nan     0.1000    0.0408
##      2        1.1470             nan     0.1000    0.0381
##      3        1.0858             nan     0.1000    0.0260
##      4        1.0336             nan     0.1000    0.0230
##      5        0.9890             nan     0.1000    0.0180
##      6        0.9505             nan     0.1000    0.0156
##      7        0.9115             nan     0.1000    0.0129
##      8        0.8763             nan     0.1000    0.0139
##      9        0.8489             nan     0.1000    0.0099
##     10        0.8214             nan     0.1000    0.0110
##     20        0.6611             nan     0.1000    0.0029
##     40        0.5005             nan     0.1000   -0.0016
##     60        0.4032             nan     0.1000   -0.0013
##     80        0.3344             nan     0.1000   -0.0011
##    100        0.2879             nan     0.1000   -0.0003
##    120        0.2415             nan     0.1000   -0.0002
##    140        0.2060             nan     0.1000    0.0001
##    160        0.1745             nan     0.1000   -0.0008
##    180        0.1481             nan     0.1000   -0.0008
##    200        0.1249             nan     0.1000   -0.0000
##    220        0.1074             nan     0.1000   -0.0000
##    240        0.0925             nan     0.1000   -0.0003
##    260        0.0790             nan     0.1000   -0.0002
##    280        0.0690             nan     0.1000    0.0000
##    300        0.0596             nan     0.1000   -0.0001
##    320        0.0518             nan     0.1000   -0.0000
##    340        0.0458             nan     0.1000   -0.0003
##    360        0.0407             nan     0.1000   -0.0002
##    380        0.0353             nan     0.1000    0.0001
##    400        0.0312             nan     0.1000   -0.0001
##    420        0.0275             nan     0.1000   -0.0001
##    440        0.0236             nan     0.1000   -0.0001
##    460        0.0207             nan     0.1000   -0.0001
##    480        0.0183             nan     0.1000   -0.0001
##    500        0.0162             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2377             nan     0.1000    0.0367
##      2        1.1629             nan     0.1000    0.0351
##      3        1.1008             nan     0.1000    0.0252
##      4        1.0391             nan     0.1000    0.0281
##      5        0.9928             nan     0.1000    0.0202
##      6        0.9499             nan     0.1000    0.0188
##      7        0.9065             nan     0.1000    0.0157
##      8        0.8758             nan     0.1000    0.0106
##      9        0.8465             nan     0.1000    0.0106
##     10        0.8163             nan     0.1000    0.0122
##     20        0.6473             nan     0.1000    0.0032
##     40        0.4988             nan     0.1000    0.0005
##     60        0.4048             nan     0.1000    0.0001
##     80        0.3357             nan     0.1000   -0.0006
##    100        0.2834             nan     0.1000   -0.0001
##    120        0.2416             nan     0.1000   -0.0007
##    140        0.2043             nan     0.1000   -0.0006
##    160        0.1766             nan     0.1000   -0.0011
##    180        0.1526             nan     0.1000   -0.0007
##    200        0.1318             nan     0.1000   -0.0002
##    220        0.1145             nan     0.1000   -0.0005
##    240        0.1003             nan     0.1000   -0.0005
##    260        0.0878             nan     0.1000   -0.0006
##    280        0.0765             nan     0.1000   -0.0001
##    300        0.0675             nan     0.1000   -0.0001
##    320        0.0594             nan     0.1000   -0.0003
##    340        0.0525             nan     0.1000   -0.0001
##    360        0.0457             nan     0.1000   -0.0002
##    380        0.0404             nan     0.1000   -0.0001
##    400        0.0352             nan     0.1000   -0.0001
##    420        0.0312             nan     0.1000   -0.0002
##    440        0.0275             nan     0.1000   -0.0001
##    460        0.0243             nan     0.1000   -0.0001
##    480        0.0217             nan     0.1000   -0.0000
##    500        0.0190             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0005
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3184             nan     0.0010    0.0004
##      4        1.3175             nan     0.0010    0.0004
##      5        1.3166             nan     0.0010    0.0004
##      6        1.3157             nan     0.0010    0.0004
##      7        1.3148             nan     0.0010    0.0004
##      8        1.3140             nan     0.0010    0.0004
##      9        1.3131             nan     0.0010    0.0004
##     10        1.3123             nan     0.0010    0.0003
##     20        1.3035             nan     0.0010    0.0004
##     40        1.2864             nan     0.0010    0.0004
##     60        1.2699             nan     0.0010    0.0004
##     80        1.2537             nan     0.0010    0.0003
##    100        1.2381             nan     0.0010    0.0003
##    120        1.2229             nan     0.0010    0.0003
##    140        1.2087             nan     0.0010    0.0003
##    160        1.1945             nan     0.0010    0.0003
##    180        1.1808             nan     0.0010    0.0003
##    200        1.1677             nan     0.0010    0.0003
##    220        1.1546             nan     0.0010    0.0003
##    240        1.1421             nan     0.0010    0.0003
##    260        1.1299             nan     0.0010    0.0003
##    280        1.1181             nan     0.0010    0.0003
##    300        1.1069             nan     0.0010    0.0002
##    320        1.0957             nan     0.0010    0.0002
##    340        1.0849             nan     0.0010    0.0002
##    360        1.0743             nan     0.0010    0.0002
##    380        1.0639             nan     0.0010    0.0002
##    400        1.0538             nan     0.0010    0.0002
##    420        1.0441             nan     0.0010    0.0002
##    440        1.0347             nan     0.0010    0.0002
##    460        1.0256             nan     0.0010    0.0002
##    480        1.0167             nan     0.0010    0.0002
##    500        1.0080             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3194             nan     0.0010    0.0004
##      3        1.3185             nan     0.0010    0.0004
##      4        1.3175             nan     0.0010    0.0004
##      5        1.3166             nan     0.0010    0.0004
##      6        1.3158             nan     0.0010    0.0003
##      7        1.3149             nan     0.0010    0.0004
##      8        1.3140             nan     0.0010    0.0004
##      9        1.3132             nan     0.0010    0.0004
##     10        1.3122             nan     0.0010    0.0004
##     20        1.3033             nan     0.0010    0.0004
##     40        1.2863             nan     0.0010    0.0004
##     60        1.2699             nan     0.0010    0.0004
##     80        1.2539             nan     0.0010    0.0003
##    100        1.2386             nan     0.0010    0.0003
##    120        1.2236             nan     0.0010    0.0003
##    140        1.2091             nan     0.0010    0.0003
##    160        1.1951             nan     0.0010    0.0004
##    180        1.1815             nan     0.0010    0.0003
##    200        1.1683             nan     0.0010    0.0003
##    220        1.1554             nan     0.0010    0.0003
##    240        1.1427             nan     0.0010    0.0003
##    260        1.1307             nan     0.0010    0.0002
##    280        1.1191             nan     0.0010    0.0003
##    300        1.1075             nan     0.0010    0.0003
##    320        1.0967             nan     0.0010    0.0003
##    340        1.0858             nan     0.0010    0.0002
##    360        1.0752             nan     0.0010    0.0002
##    380        1.0651             nan     0.0010    0.0002
##    400        1.0552             nan     0.0010    0.0002
##    420        1.0456             nan     0.0010    0.0002
##    440        1.0362             nan     0.0010    0.0002
##    460        1.0269             nan     0.0010    0.0002
##    480        1.0181             nan     0.0010    0.0002
##    500        1.0093             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3194             nan     0.0010    0.0004
##      3        1.3185             nan     0.0010    0.0004
##      4        1.3176             nan     0.0010    0.0004
##      5        1.3168             nan     0.0010    0.0004
##      6        1.3160             nan     0.0010    0.0004
##      7        1.3150             nan     0.0010    0.0004
##      8        1.3142             nan     0.0010    0.0004
##      9        1.3132             nan     0.0010    0.0004
##     10        1.3124             nan     0.0010    0.0004
##     20        1.3036             nan     0.0010    0.0004
##     40        1.2864             nan     0.0010    0.0004
##     60        1.2699             nan     0.0010    0.0004
##     80        1.2536             nan     0.0010    0.0003
##    100        1.2380             nan     0.0010    0.0004
##    120        1.2231             nan     0.0010    0.0003
##    140        1.2088             nan     0.0010    0.0003
##    160        1.1948             nan     0.0010    0.0003
##    180        1.1814             nan     0.0010    0.0003
##    200        1.1686             nan     0.0010    0.0003
##    220        1.1558             nan     0.0010    0.0003
##    240        1.1436             nan     0.0010    0.0002
##    260        1.1315             nan     0.0010    0.0003
##    280        1.1199             nan     0.0010    0.0002
##    300        1.1087             nan     0.0010    0.0002
##    320        1.0976             nan     0.0010    0.0002
##    340        1.0869             nan     0.0010    0.0002
##    360        1.0765             nan     0.0010    0.0002
##    380        1.0664             nan     0.0010    0.0002
##    400        1.0567             nan     0.0010    0.0002
##    420        1.0470             nan     0.0010    0.0002
##    440        1.0379             nan     0.0010    0.0002
##    460        1.0288             nan     0.0010    0.0002
##    480        1.0202             nan     0.0010    0.0002
##    500        1.0116             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3192             nan     0.0010    0.0005
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3173             nan     0.0010    0.0005
##      5        1.3163             nan     0.0010    0.0004
##      6        1.3152             nan     0.0010    0.0005
##      7        1.3143             nan     0.0010    0.0004
##      8        1.3132             nan     0.0010    0.0005
##      9        1.3123             nan     0.0010    0.0004
##     10        1.3114             nan     0.0010    0.0004
##     20        1.3021             nan     0.0010    0.0004
##     40        1.2838             nan     0.0010    0.0005
##     60        1.2661             nan     0.0010    0.0004
##     80        1.2489             nan     0.0010    0.0004
##    100        1.2323             nan     0.0010    0.0004
##    120        1.2165             nan     0.0010    0.0003
##    140        1.2008             nan     0.0010    0.0003
##    160        1.1855             nan     0.0010    0.0004
##    180        1.1709             nan     0.0010    0.0004
##    200        1.1570             nan     0.0010    0.0003
##    220        1.1432             nan     0.0010    0.0003
##    240        1.1301             nan     0.0010    0.0002
##    260        1.1173             nan     0.0010    0.0003
##    280        1.1049             nan     0.0010    0.0003
##    300        1.0928             nan     0.0010    0.0003
##    320        1.0814             nan     0.0010    0.0003
##    340        1.0702             nan     0.0010    0.0002
##    360        1.0592             nan     0.0010    0.0002
##    380        1.0487             nan     0.0010    0.0002
##    400        1.0384             nan     0.0010    0.0002
##    420        1.0283             nan     0.0010    0.0002
##    440        1.0182             nan     0.0010    0.0002
##    460        1.0088             nan     0.0010    0.0002
##    480        0.9997             nan     0.0010    0.0002
##    500        0.9904             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0005
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3183             nan     0.0010    0.0005
##      4        1.3173             nan     0.0010    0.0005
##      5        1.3163             nan     0.0010    0.0005
##      6        1.3154             nan     0.0010    0.0004
##      7        1.3145             nan     0.0010    0.0004
##      8        1.3135             nan     0.0010    0.0004
##      9        1.3125             nan     0.0010    0.0004
##     10        1.3115             nan     0.0010    0.0004
##     20        1.3022             nan     0.0010    0.0004
##     40        1.2836             nan     0.0010    0.0004
##     60        1.2660             nan     0.0010    0.0004
##     80        1.2490             nan     0.0010    0.0004
##    100        1.2323             nan     0.0010    0.0004
##    120        1.2163             nan     0.0010    0.0004
##    140        1.2012             nan     0.0010    0.0003
##    160        1.1860             nan     0.0010    0.0004
##    180        1.1713             nan     0.0010    0.0003
##    200        1.1574             nan     0.0010    0.0003
##    220        1.1439             nan     0.0010    0.0003
##    240        1.1310             nan     0.0010    0.0003
##    260        1.1182             nan     0.0010    0.0002
##    280        1.1056             nan     0.0010    0.0003
##    300        1.0934             nan     0.0010    0.0003
##    320        1.0819             nan     0.0010    0.0003
##    340        1.0707             nan     0.0010    0.0002
##    360        1.0596             nan     0.0010    0.0003
##    380        1.0490             nan     0.0010    0.0002
##    400        1.0385             nan     0.0010    0.0002
##    420        1.0283             nan     0.0010    0.0002
##    440        1.0186             nan     0.0010    0.0002
##    460        1.0090             nan     0.0010    0.0002
##    480        0.9995             nan     0.0010    0.0002
##    500        0.9901             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0005
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3184             nan     0.0010    0.0004
##      4        1.3174             nan     0.0010    0.0004
##      5        1.3165             nan     0.0010    0.0004
##      6        1.3155             nan     0.0010    0.0004
##      7        1.3146             nan     0.0010    0.0004
##      8        1.3137             nan     0.0010    0.0004
##      9        1.3128             nan     0.0010    0.0004
##     10        1.3118             nan     0.0010    0.0004
##     20        1.3025             nan     0.0010    0.0004
##     40        1.2843             nan     0.0010    0.0004
##     60        1.2667             nan     0.0010    0.0004
##     80        1.2498             nan     0.0010    0.0004
##    100        1.2340             nan     0.0010    0.0004
##    120        1.2179             nan     0.0010    0.0003
##    140        1.2028             nan     0.0010    0.0004
##    160        1.1880             nan     0.0010    0.0003
##    180        1.1737             nan     0.0010    0.0002
##    200        1.1598             nan     0.0010    0.0003
##    220        1.1459             nan     0.0010    0.0004
##    240        1.1328             nan     0.0010    0.0003
##    260        1.1197             nan     0.0010    0.0003
##    280        1.1074             nan     0.0010    0.0003
##    300        1.0955             nan     0.0010    0.0003
##    320        1.0837             nan     0.0010    0.0003
##    340        1.0724             nan     0.0010    0.0002
##    360        1.0614             nan     0.0010    0.0003
##    380        1.0507             nan     0.0010    0.0003
##    400        1.0403             nan     0.0010    0.0002
##    420        1.0304             nan     0.0010    0.0002
##    440        1.0207             nan     0.0010    0.0002
##    460        1.0112             nan     0.0010    0.0002
##    480        1.0019             nan     0.0010    0.0002
##    500        0.9928             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3201             nan     0.0010    0.0004
##      2        1.3191             nan     0.0010    0.0005
##      3        1.3180             nan     0.0010    0.0005
##      4        1.3170             nan     0.0010    0.0005
##      5        1.3158             nan     0.0010    0.0005
##      6        1.3148             nan     0.0010    0.0004
##      7        1.3138             nan     0.0010    0.0004
##      8        1.3127             nan     0.0010    0.0005
##      9        1.3117             nan     0.0010    0.0004
##     10        1.3106             nan     0.0010    0.0005
##     20        1.3009             nan     0.0010    0.0004
##     40        1.2821             nan     0.0010    0.0004
##     60        1.2636             nan     0.0010    0.0004
##     80        1.2455             nan     0.0010    0.0004
##    100        1.2283             nan     0.0010    0.0004
##    120        1.2117             nan     0.0010    0.0004
##    140        1.1956             nan     0.0010    0.0004
##    160        1.1796             nan     0.0010    0.0003
##    180        1.1644             nan     0.0010    0.0003
##    200        1.1499             nan     0.0010    0.0003
##    220        1.1357             nan     0.0010    0.0003
##    240        1.1217             nan     0.0010    0.0003
##    260        1.1085             nan     0.0010    0.0003
##    280        1.0955             nan     0.0010    0.0003
##    300        1.0831             nan     0.0010    0.0003
##    320        1.0708             nan     0.0010    0.0003
##    340        1.0591             nan     0.0010    0.0002
##    360        1.0475             nan     0.0010    0.0003
##    380        1.0366             nan     0.0010    0.0003
##    400        1.0259             nan     0.0010    0.0002
##    420        1.0152             nan     0.0010    0.0003
##    440        1.0048             nan     0.0010    0.0002
##    460        0.9949             nan     0.0010    0.0002
##    480        0.9851             nan     0.0010    0.0002
##    500        0.9757             nan     0.0010    0.0002
## 
## [output truncated: similar iteration logs for two additional shrinkage = 0.001 resamples omitted]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3116             nan     0.0100    0.0043
##      2        1.3039             nan     0.0100    0.0036
##      3        1.2944             nan     0.0100    0.0039
##      4        1.2847             nan     0.0100    0.0040
##      5        1.2764             nan     0.0100    0.0039
##      6        1.2685             nan     0.0100    0.0037
##      7        1.2596             nan     0.0100    0.0039
##      8        1.2512             nan     0.0100    0.0035
##      9        1.2435             nan     0.0100    0.0037
##     10        1.2357             nan     0.0100    0.0034
##     20        1.1639             nan     0.0100    0.0030
##     40        1.0546             nan     0.0100    0.0020
##     60        0.9689             nan     0.0100    0.0012
##     80        0.9002             nan     0.0100    0.0013
##    100        0.8466             nan     0.0100    0.0010
##    120        0.8034             nan     0.0100    0.0008
##    140        0.7664             nan     0.0100    0.0005
##    160        0.7357             nan     0.0100    0.0002
##    180        0.7087             nan     0.0100    0.0002
##    200        0.6873             nan     0.0100    0.0001
##    220        0.6669             nan     0.0100    0.0000
##    240        0.6491             nan     0.0100    0.0001
##    260        0.6333             nan     0.0100    0.0001
##    280        0.6196             nan     0.0100    0.0002
##    300        0.6075             nan     0.0100   -0.0000
##    320        0.5956             nan     0.0100   -0.0000
##    340        0.5840             nan     0.0100    0.0000
##    360        0.5737             nan     0.0100   -0.0001
##    380        0.5641             nan     0.0100    0.0001
##    400        0.5549             nan     0.0100    0.0001
##    420        0.5461             nan     0.0100   -0.0001
##    440        0.5369             nan     0.0100   -0.0000
##    460        0.5285             nan     0.0100   -0.0001
##    480        0.5212             nan     0.0100   -0.0000
##    500        0.5132             nan     0.0100   -0.0001
## 
## [output truncated: similar iteration logs for eight additional shrinkage = 0.01 resamples omitted]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2313             nan     0.1000    0.0401
##      2        1.1619             nan     0.1000    0.0303
##      3        1.0991             nan     0.1000    0.0261
##      4        1.0463             nan     0.1000    0.0204
##      5        0.9993             nan     0.1000    0.0211
##      6        0.9610             nan     0.1000    0.0163
##      7        0.9249             nan     0.1000    0.0145
##      8        0.8918             nan     0.1000    0.0129
##      9        0.8676             nan     0.1000    0.0092
##     10        0.8430             nan     0.1000    0.0098
##     20        0.6930             nan     0.1000    0.0014
##     40        0.5634             nan     0.1000    0.0010
##     60        0.4858             nan     0.1000   -0.0007
##     80        0.4347             nan     0.1000   -0.0014
##    100        0.3855             nan     0.1000   -0.0001
##    120        0.3451             nan     0.1000   -0.0006
##    140        0.3055             nan     0.1000   -0.0008
##    160        0.2741             nan     0.1000   -0.0005
##    180        0.2461             nan     0.1000   -0.0008
##    200        0.2197             nan     0.1000   -0.0006
##    220        0.2019             nan     0.1000   -0.0005
##    240        0.1833             nan     0.1000   -0.0002
##    260        0.1677             nan     0.1000    0.0000
##    280        0.1530             nan     0.1000   -0.0003
##    300        0.1392             nan     0.1000   -0.0003
##    320        0.1257             nan     0.1000   -0.0003
##    340        0.1143             nan     0.1000   -0.0004
##    360        0.1047             nan     0.1000   -0.0002
##    380        0.0967             nan     0.1000   -0.0004
##    400        0.0882             nan     0.1000   -0.0003
##    420        0.0815             nan     0.1000    0.0000
##    440        0.0752             nan     0.1000   -0.0002
##    460        0.0695             nan     0.1000   -0.0002
##    480        0.0648             nan     0.1000   -0.0000
##    500        0.0597             nan     0.1000   -0.0001
## 
## [output truncated: similar iteration logs for two additional shrinkage = 0.1 resamples omitted]
## 
## 
## (Verbose gbm boosting traces truncated: analogous Iter / TrainDeviance / ValidDeviance / StepSize / Improve logs were printed for each remaining cross-validation resample and tuning combination, covering shrinkage settings of 0.1000, 0.0010 and 0.0100.)
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3122             nan     0.0100    0.0038
##      2        1.3043             nan     0.0100    0.0034
##      3        1.2963             nan     0.0100    0.0037
##      4        1.2875             nan     0.0100    0.0041
##      5        1.2807             nan     0.0100    0.0031
##      6        1.2736             nan     0.0100    0.0028
##      7        1.2661             nan     0.0100    0.0034
##      8        1.2580             nan     0.0100    0.0034
##      9        1.2505             nan     0.0100    0.0031
##     10        1.2426             nan     0.0100    0.0037
##     20        1.1759             nan     0.0100    0.0027
##     40        1.0692             nan     0.0100    0.0017
##     60        0.9874             nan     0.0100    0.0016
##     80        0.9246             nan     0.0100    0.0010
##    100        0.8693             nan     0.0100    0.0011
##    120        0.8263             nan     0.0100    0.0007
##    140        0.7894             nan     0.0100    0.0007
##    160        0.7580             nan     0.0100    0.0003
##    180        0.7326             nan     0.0100    0.0004
##    200        0.7114             nan     0.0100    0.0001
##    220        0.6919             nan     0.0100    0.0002
##    240        0.6736             nan     0.0100    0.0001
##    260        0.6579             nan     0.0100    0.0001
##    280        0.6432             nan     0.0100    0.0001
##    300        0.6317             nan     0.0100   -0.0000
##    320        0.6200             nan     0.0100    0.0001
##    340        0.6083             nan     0.0100   -0.0001
##    360        0.5966             nan     0.0100   -0.0000
##    380        0.5871             nan     0.0100   -0.0000
##    400        0.5775             nan     0.0100   -0.0000
##    420        0.5674             nan     0.0100   -0.0001
##    440        0.5589             nan     0.0100   -0.0001
##    460        0.5503             nan     0.0100   -0.0000
##    480        0.5414             nan     0.0100   -0.0000
##    500        0.5324             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3114             nan     0.0100    0.0040
##      2        1.3037             nan     0.0100    0.0033
##      3        1.2953             nan     0.0100    0.0040
##      4        1.2874             nan     0.0100    0.0036
##      5        1.2794             nan     0.0100    0.0039
##      6        1.2716             nan     0.0100    0.0037
##      7        1.2640             nan     0.0100    0.0036
##      8        1.2567             nan     0.0100    0.0033
##      9        1.2491             nan     0.0100    0.0039
##     10        1.2423             nan     0.0100    0.0032
##     20        1.1740             nan     0.0100    0.0023
##     40        1.0686             nan     0.0100    0.0022
##     60        0.9868             nan     0.0100    0.0014
##     80        0.9221             nan     0.0100    0.0010
##    100        0.8702             nan     0.0100    0.0009
##    120        0.8272             nan     0.0100    0.0005
##    140        0.7930             nan     0.0100    0.0004
##    160        0.7651             nan     0.0100    0.0004
##    180        0.7391             nan     0.0100    0.0003
##    200        0.7164             nan     0.0100    0.0001
##    220        0.6954             nan     0.0100    0.0002
##    240        0.6778             nan     0.0100    0.0003
##    260        0.6619             nan     0.0100    0.0004
##    280        0.6488             nan     0.0100   -0.0000
##    300        0.6360             nan     0.0100   -0.0002
##    320        0.6245             nan     0.0100    0.0001
##    340        0.6131             nan     0.0100    0.0000
##    360        0.6026             nan     0.0100   -0.0001
##    380        0.5918             nan     0.0100   -0.0000
##    400        0.5817             nan     0.0100    0.0000
##    420        0.5714             nan     0.0100   -0.0000
##    440        0.5619             nan     0.0100    0.0000
##    460        0.5524             nan     0.0100    0.0001
##    480        0.5439             nan     0.0100   -0.0000
##    500        0.5355             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3132             nan     0.0100    0.0038
##      2        1.3053             nan     0.0100    0.0039
##      3        1.2965             nan     0.0100    0.0039
##      4        1.2890             nan     0.0100    0.0031
##      5        1.2813             nan     0.0100    0.0030
##      6        1.2733             nan     0.0100    0.0036
##      7        1.2654             nan     0.0100    0.0037
##      8        1.2582             nan     0.0100    0.0034
##      9        1.2506             nan     0.0100    0.0033
##     10        1.2428             nan     0.0100    0.0034
##     20        1.1791             nan     0.0100    0.0023
##     40        1.0725             nan     0.0100    0.0016
##     60        0.9909             nan     0.0100    0.0017
##     80        0.9253             nan     0.0100    0.0010
##    100        0.8734             nan     0.0100    0.0008
##    120        0.8305             nan     0.0100    0.0006
##    140        0.7957             nan     0.0100    0.0006
##    160        0.7661             nan     0.0100    0.0004
##    180        0.7418             nan     0.0100    0.0001
##    200        0.7204             nan     0.0100    0.0001
##    220        0.7011             nan     0.0100    0.0001
##    240        0.6841             nan     0.0100    0.0002
##    260        0.6681             nan     0.0100    0.0002
##    280        0.6549             nan     0.0100    0.0001
##    300        0.6420             nan     0.0100    0.0001
##    320        0.6295             nan     0.0100    0.0001
##    340        0.6187             nan     0.0100    0.0000
##    360        0.6070             nan     0.0100    0.0000
##    380        0.5969             nan     0.0100   -0.0000
##    400        0.5875             nan     0.0100   -0.0000
##    420        0.5779             nan     0.0100   -0.0000
##    440        0.5686             nan     0.0100    0.0000
##    460        0.5592             nan     0.0100   -0.0001
##    480        0.5509             nan     0.0100   -0.0001
##    500        0.5421             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3113             nan     0.0100    0.0045
##      2        1.3021             nan     0.0100    0.0042
##      3        1.2929             nan     0.0100    0.0042
##      4        1.2841             nan     0.0100    0.0037
##      5        1.2762             nan     0.0100    0.0039
##      6        1.2680             nan     0.0100    0.0037
##      7        1.2605             nan     0.0100    0.0031
##      8        1.2519             nan     0.0100    0.0036
##      9        1.2443             nan     0.0100    0.0033
##     10        1.2370             nan     0.0100    0.0032
##     20        1.1662             nan     0.0100    0.0031
##     40        1.0536             nan     0.0100    0.0017
##     60        0.9664             nan     0.0100    0.0018
##     80        0.8991             nan     0.0100    0.0013
##    100        0.8440             nan     0.0100    0.0010
##    120        0.7983             nan     0.0100    0.0008
##    140        0.7608             nan     0.0100    0.0005
##    160        0.7283             nan     0.0100    0.0002
##    180        0.7020             nan     0.0100    0.0002
##    200        0.6778             nan     0.0100    0.0001
##    220        0.6560             nan     0.0100    0.0003
##    240        0.6354             nan     0.0100    0.0001
##    260        0.6178             nan     0.0100    0.0002
##    280        0.6020             nan     0.0100    0.0000
##    300        0.5867             nan     0.0100    0.0001
##    320        0.5737             nan     0.0100    0.0001
##    340        0.5613             nan     0.0100    0.0001
##    360        0.5488             nan     0.0100    0.0002
##    380        0.5371             nan     0.0100   -0.0001
##    400        0.5263             nan     0.0100   -0.0000
##    420        0.5158             nan     0.0100   -0.0001
##    440        0.5058             nan     0.0100   -0.0000
##    460        0.4952             nan     0.0100   -0.0000
##    480        0.4858             nan     0.0100   -0.0002
##    500        0.4769             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3121             nan     0.0100    0.0042
##      2        1.3034             nan     0.0100    0.0044
##      3        1.2946             nan     0.0100    0.0039
##      4        1.2858             nan     0.0100    0.0039
##      5        1.2773             nan     0.0100    0.0039
##      6        1.2687             nan     0.0100    0.0035
##      7        1.2601             nan     0.0100    0.0040
##      8        1.2519             nan     0.0100    0.0038
##      9        1.2444             nan     0.0100    0.0034
##     10        1.2362             nan     0.0100    0.0036
##     20        1.1650             nan     0.0100    0.0031
##     40        1.0547             nan     0.0100    0.0018
##     60        0.9698             nan     0.0100    0.0016
##     80        0.9000             nan     0.0100    0.0015
##    100        0.8445             nan     0.0100    0.0010
##    120        0.7998             nan     0.0100    0.0008
##    140        0.7636             nan     0.0100    0.0005
##    160        0.7325             nan     0.0100    0.0006
##    180        0.7072             nan     0.0100    0.0003
##    200        0.6821             nan     0.0100    0.0002
##    220        0.6614             nan     0.0100    0.0004
##    240        0.6425             nan     0.0100    0.0002
##    260        0.6257             nan     0.0100    0.0002
##    280        0.6109             nan     0.0100   -0.0000
##    300        0.5961             nan     0.0100    0.0001
##    320        0.5829             nan     0.0100   -0.0000
##    340        0.5701             nan     0.0100   -0.0003
##    360        0.5584             nan     0.0100   -0.0001
##    380        0.5468             nan     0.0100   -0.0000
##    400        0.5372             nan     0.0100   -0.0000
##    420        0.5265             nan     0.0100    0.0001
##    440        0.5164             nan     0.0100    0.0002
##    460        0.5074             nan     0.0100   -0.0001
##    480        0.4972             nan     0.0100    0.0002
##    500        0.4878             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3121             nan     0.0100    0.0038
##      2        1.3030             nan     0.0100    0.0042
##      3        1.2951             nan     0.0100    0.0036
##      4        1.2869             nan     0.0100    0.0041
##      5        1.2782             nan     0.0100    0.0040
##      6        1.2703             nan     0.0100    0.0034
##      7        1.2624             nan     0.0100    0.0034
##      8        1.2541             nan     0.0100    0.0035
##      9        1.2467             nan     0.0100    0.0029
##     10        1.2394             nan     0.0100    0.0031
##     20        1.1696             nan     0.0100    0.0027
##     40        1.0609             nan     0.0100    0.0017
##     60        0.9767             nan     0.0100    0.0013
##     80        0.9090             nan     0.0100    0.0012
##    100        0.8560             nan     0.0100    0.0009
##    120        0.8103             nan     0.0100    0.0007
##    140        0.7744             nan     0.0100    0.0007
##    160        0.7435             nan     0.0100    0.0003
##    180        0.7164             nan     0.0100    0.0004
##    200        0.6923             nan     0.0100    0.0004
##    220        0.6718             nan     0.0100    0.0002
##    240        0.6528             nan     0.0100    0.0002
##    260        0.6346             nan     0.0100    0.0000
##    280        0.6193             nan     0.0100    0.0001
##    300        0.6043             nan     0.0100    0.0001
##    320        0.5906             nan     0.0100   -0.0001
##    340        0.5778             nan     0.0100   -0.0000
##    360        0.5657             nan     0.0100   -0.0001
##    380        0.5542             nan     0.0100   -0.0000
##    400        0.5433             nan     0.0100   -0.0001
##    420        0.5332             nan     0.0100   -0.0000
##    440        0.5230             nan     0.0100   -0.0001
##    460        0.5137             nan     0.0100    0.0001
##    480        0.5036             nan     0.0100    0.0000
##    500        0.4948             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3107             nan     0.0100    0.0045
##      2        1.3009             nan     0.0100    0.0042
##      3        1.2905             nan     0.0100    0.0043
##      4        1.2822             nan     0.0100    0.0037
##      5        1.2722             nan     0.0100    0.0039
##      6        1.2637             nan     0.0100    0.0042
##      7        1.2560             nan     0.0100    0.0033
##      8        1.2481             nan     0.0100    0.0036
##      9        1.2403             nan     0.0100    0.0034
##     10        1.2320             nan     0.0100    0.0034
##     20        1.1556             nan     0.0100    0.0031
##     40        1.0387             nan     0.0100    0.0024
##     60        0.9495             nan     0.0100    0.0015
##     80        0.8786             nan     0.0100    0.0013
##    100        0.8210             nan     0.0100    0.0012
##    120        0.7745             nan     0.0100    0.0007
##    140        0.7346             nan     0.0100    0.0006
##    160        0.7014             nan     0.0100    0.0004
##    180        0.6738             nan     0.0100    0.0004
##    200        0.6475             nan     0.0100    0.0002
##    220        0.6242             nan     0.0100    0.0001
##    240        0.6033             nan     0.0100   -0.0000
##    260        0.5851             nan     0.0100    0.0001
##    280        0.5677             nan     0.0100    0.0001
##    300        0.5518             nan     0.0100    0.0001
##    320        0.5369             nan     0.0100    0.0001
##    340        0.5228             nan     0.0100    0.0000
##    360        0.5093             nan     0.0100    0.0002
##    380        0.4953             nan     0.0100    0.0001
##    400        0.4841             nan     0.0100    0.0001
##    420        0.4730             nan     0.0100    0.0001
##    440        0.4624             nan     0.0100   -0.0002
##    460        0.4527             nan     0.0100   -0.0001
##    480        0.4426             nan     0.0100   -0.0000
##    500        0.4327             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3108             nan     0.0100    0.0043
##      2        1.3012             nan     0.0100    0.0044
##      3        1.2927             nan     0.0100    0.0038
##      4        1.2842             nan     0.0100    0.0043
##      5        1.2754             nan     0.0100    0.0042
##      6        1.2666             nan     0.0100    0.0039
##      7        1.2583             nan     0.0100    0.0034
##      8        1.2500             nan     0.0100    0.0039
##      9        1.2413             nan     0.0100    0.0038
##     10        1.2333             nan     0.0100    0.0036
##     20        1.1599             nan     0.0100    0.0027
##     40        1.0445             nan     0.0100    0.0023
##     60        0.9554             nan     0.0100    0.0016
##     80        0.8860             nan     0.0100    0.0014
##    100        0.8306             nan     0.0100    0.0008
##    120        0.7837             nan     0.0100    0.0006
##    140        0.7445             nan     0.0100    0.0008
##    160        0.7111             nan     0.0100    0.0004
##    180        0.6822             nan     0.0100    0.0004
##    200        0.6573             nan     0.0100    0.0003
##    220        0.6342             nan     0.0100    0.0000
##    240        0.6136             nan     0.0100   -0.0000
##    260        0.5950             nan     0.0100    0.0001
##    280        0.5793             nan     0.0100   -0.0000
##    300        0.5630             nan     0.0100   -0.0001
##    320        0.5483             nan     0.0100   -0.0001
##    340        0.5339             nan     0.0100    0.0001
##    360        0.5206             nan     0.0100    0.0001
##    380        0.5073             nan     0.0100   -0.0000
##    400        0.4956             nan     0.0100    0.0001
##    420        0.4836             nan     0.0100    0.0001
##    440        0.4724             nan     0.0100    0.0000
##    460        0.4615             nan     0.0100    0.0000
##    480        0.4524             nan     0.0100   -0.0002
##    500        0.4438             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3114             nan     0.0100    0.0040
##      2        1.3027             nan     0.0100    0.0036
##      3        1.2934             nan     0.0100    0.0042
##      4        1.2847             nan     0.0100    0.0037
##      5        1.2760             nan     0.0100    0.0042
##      6        1.2673             nan     0.0100    0.0035
##      7        1.2592             nan     0.0100    0.0035
##      8        1.2509             nan     0.0100    0.0039
##      9        1.2430             nan     0.0100    0.0036
##     10        1.2346             nan     0.0100    0.0035
##     20        1.1624             nan     0.0100    0.0023
##     40        1.0485             nan     0.0100    0.0020
##     60        0.9606             nan     0.0100    0.0018
##     80        0.8892             nan     0.0100    0.0011
##    100        0.8334             nan     0.0100    0.0009
##    120        0.7898             nan     0.0100    0.0005
##    140        0.7508             nan     0.0100    0.0005
##    160        0.7183             nan     0.0100    0.0006
##    180        0.6911             nan     0.0100    0.0002
##    200        0.6658             nan     0.0100    0.0002
##    220        0.6433             nan     0.0100    0.0002
##    240        0.6222             nan     0.0100    0.0000
##    260        0.6032             nan     0.0100   -0.0001
##    280        0.5865             nan     0.0100   -0.0000
##    300        0.5701             nan     0.0100    0.0001
##    320        0.5549             nan     0.0100    0.0000
##    340        0.5417             nan     0.0100   -0.0002
##    360        0.5286             nan     0.0100    0.0001
##    380        0.5171             nan     0.0100    0.0000
##    400        0.5054             nan     0.0100   -0.0000
##    420        0.4941             nan     0.0100   -0.0002
##    440        0.4838             nan     0.0100   -0.0001
##    460        0.4732             nan     0.0100    0.0001
##    480        0.4629             nan     0.0100   -0.0001
##    500        0.4526             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2447             nan     0.1000    0.0344
##      2        1.1709             nan     0.1000    0.0304
##      3        1.1140             nan     0.1000    0.0274
##      4        1.0649             nan     0.1000    0.0200
##      5        1.0277             nan     0.1000    0.0168
##      6        0.9906             nan     0.1000    0.0150
##      7        0.9513             nan     0.1000    0.0145
##      8        0.9248             nan     0.1000    0.0106
##      9        0.8956             nan     0.1000    0.0122
##     10        0.8747             nan     0.1000    0.0066
##     20        0.7203             nan     0.1000    0.0031
##     40        0.5863             nan     0.1000   -0.0004
##     60        0.5004             nan     0.1000    0.0009
##     80        0.4303             nan     0.1000   -0.0007
##    100        0.3850             nan     0.1000   -0.0003
##    120        0.3405             nan     0.1000   -0.0003
##    140        0.3035             nan     0.1000   -0.0006
##    160        0.2707             nan     0.1000   -0.0003
##    180        0.2432             nan     0.1000   -0.0005
##    200        0.2195             nan     0.1000    0.0001
##    220        0.1980             nan     0.1000   -0.0008
##    240        0.1773             nan     0.1000   -0.0007
##    260        0.1621             nan     0.1000   -0.0005
##    280        0.1473             nan     0.1000   -0.0005
##    300        0.1329             nan     0.1000   -0.0000
##    320        0.1208             nan     0.1000    0.0003
##    340        0.1102             nan     0.1000   -0.0001
##    360        0.1007             nan     0.1000   -0.0004
##    380        0.0927             nan     0.1000   -0.0001
##    400        0.0843             nan     0.1000   -0.0002
##    420        0.0777             nan     0.1000   -0.0003
##    440        0.0715             nan     0.1000   -0.0002
##    460        0.0647             nan     0.1000   -0.0002
##    480        0.0591             nan     0.1000   -0.0001
##    500        0.0544             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2460             nan     0.1000    0.0332
##      2        1.1701             nan     0.1000    0.0333
##      3        1.1088             nan     0.1000    0.0292
##      4        1.0586             nan     0.1000    0.0215
##      5        1.0169             nan     0.1000    0.0173
##      6        0.9822             nan     0.1000    0.0156
##      7        0.9517             nan     0.1000    0.0129
##      8        0.9211             nan     0.1000    0.0117
##      9        0.8918             nan     0.1000    0.0135
##     10        0.8710             nan     0.1000    0.0070
##     20        0.7252             nan     0.1000    0.0004
##     40        0.5905             nan     0.1000   -0.0011
##     60        0.5149             nan     0.1000   -0.0009
##     80        0.4447             nan     0.1000    0.0000
##    100        0.3942             nan     0.1000   -0.0018
##    120        0.3556             nan     0.1000   -0.0011
##    140        0.3178             nan     0.1000    0.0005
##    160        0.2871             nan     0.1000   -0.0004
##    180        0.2540             nan     0.1000   -0.0007
##    200        0.2316             nan     0.1000   -0.0008
##    220        0.2095             nan     0.1000   -0.0005
##    240        0.1890             nan     0.1000   -0.0001
##    260        0.1707             nan     0.1000   -0.0003
##    280        0.1542             nan     0.1000   -0.0001
##    300        0.1404             nan     0.1000   -0.0003
##    320        0.1283             nan     0.1000   -0.0005
##    340        0.1170             nan     0.1000   -0.0003
##    360        0.1075             nan     0.1000   -0.0002
##    380        0.0988             nan     0.1000   -0.0003
##    400        0.0905             nan     0.1000   -0.0003
##    420        0.0836             nan     0.1000   -0.0002
##    440        0.0766             nan     0.1000    0.0001
##    460        0.0707             nan     0.1000    0.0000
##    480        0.0658             nan     0.1000   -0.0003
##    500        0.0605             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2390             nan     0.1000    0.0361
##      2        1.1672             nan     0.1000    0.0327
##      3        1.1118             nan     0.1000    0.0198
##      4        1.0591             nan     0.1000    0.0213
##      5        1.0184             nan     0.1000    0.0160
##      6        0.9817             nan     0.1000    0.0178
##      7        0.9459             nan     0.1000    0.0131
##      8        0.9151             nan     0.1000    0.0142
##      9        0.8890             nan     0.1000    0.0112
##     10        0.8648             nan     0.1000    0.0106
##     20        0.7173             nan     0.1000    0.0007
##     40        0.5817             nan     0.1000   -0.0001
##     60        0.4991             nan     0.1000   -0.0006
##     80        0.4376             nan     0.1000   -0.0006
##    100        0.3834             nan     0.1000    0.0002
##    120        0.3418             nan     0.1000    0.0002
##    140        0.3103             nan     0.1000   -0.0014
##    160        0.2820             nan     0.1000   -0.0009
##    180        0.2514             nan     0.1000   -0.0003
##    200        0.2267             nan     0.1000   -0.0002
##    220        0.2052             nan     0.1000   -0.0006
##    240        0.1848             nan     0.1000   -0.0000
##    260        0.1680             nan     0.1000   -0.0003
##    280        0.1523             nan     0.1000   -0.0008
##    300        0.1379             nan     0.1000   -0.0004
##    320        0.1253             nan     0.1000   -0.0004
##    340        0.1150             nan     0.1000   -0.0003
##    360        0.1045             nan     0.1000   -0.0003
##    380        0.0962             nan     0.1000   -0.0003
##    400        0.0890             nan     0.1000   -0.0001
##    420        0.0821             nan     0.1000   -0.0001
##    440        0.0756             nan     0.1000   -0.0003
##    460        0.0694             nan     0.1000   -0.0001
##    480        0.0636             nan     0.1000   -0.0002
##    500        0.0595             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2329             nan     0.1000    0.0408
##      2        1.1597             nan     0.1000    0.0315
##      3        1.0987             nan     0.1000    0.0278
##      4        1.0450             nan     0.1000    0.0214
##      5        1.0017             nan     0.1000    0.0177
##      6        0.9608             nan     0.1000    0.0154
##      7        0.9265             nan     0.1000    0.0121
##      8        0.8977             nan     0.1000    0.0086
##      9        0.8666             nan     0.1000    0.0122
##     10        0.8408             nan     0.1000    0.0113
##     20        0.6877             nan     0.1000    0.0009
##     40        0.5405             nan     0.1000    0.0007
##     60        0.4448             nan     0.1000   -0.0004
##     80        0.3753             nan     0.1000   -0.0005
##    100        0.3214             nan     0.1000   -0.0012
##    120        0.2741             nan     0.1000    0.0003
##    140        0.2380             nan     0.1000    0.0001
##    160        0.2046             nan     0.1000   -0.0004
##    180        0.1775             nan     0.1000   -0.0003
##    200        0.1553             nan     0.1000   -0.0000
##    220        0.1353             nan     0.1000   -0.0002
##    240        0.1210             nan     0.1000   -0.0002
##    260        0.1057             nan     0.1000   -0.0005
##    280        0.0947             nan     0.1000   -0.0002
##    300        0.0843             nan     0.1000   -0.0001
##    320        0.0756             nan     0.1000   -0.0002
##    340        0.0681             nan     0.1000   -0.0003
##    360        0.0609             nan     0.1000   -0.0000
##    380        0.0547             nan     0.1000   -0.0001
##    400        0.0495             nan     0.1000   -0.0001
##    420        0.0447             nan     0.1000   -0.0000
##    440        0.0409             nan     0.1000   -0.0001
##    460        0.0367             nan     0.1000   -0.0000
##    480        0.0331             nan     0.1000   -0.0001
##    500        0.0303             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2349             nan     0.1000    0.0394
##      2        1.1647             nan     0.1000    0.0297
##      3        1.1021             nan     0.1000    0.0290
##      4        1.0490             nan     0.1000    0.0249
##      5        1.0037             nan     0.1000    0.0196
##      6        0.9635             nan     0.1000    0.0151
##      7        0.9272             nan     0.1000    0.0135
##      8        0.8959             nan     0.1000    0.0123
##      9        0.8694             nan     0.1000    0.0111
##     10        0.8446             nan     0.1000    0.0089
##     20        0.6826             nan     0.1000    0.0016
##     40        0.5282             nan     0.1000   -0.0014
##     60        0.4499             nan     0.1000   -0.0013
##     80        0.3851             nan     0.1000   -0.0002
##    100        0.3223             nan     0.1000   -0.0000
##    120        0.2778             nan     0.1000    0.0001
##    140        0.2415             nan     0.1000   -0.0006
##    160        0.2113             nan     0.1000   -0.0006
##    180        0.1882             nan     0.1000   -0.0008
##    200        0.1636             nan     0.1000   -0.0004
##    220        0.1440             nan     0.1000   -0.0002
##    240        0.1279             nan     0.1000   -0.0008
##    260        0.1138             nan     0.1000   -0.0003
##    280        0.1007             nan     0.1000   -0.0001
##    300        0.0889             nan     0.1000   -0.0002
##    320        0.0799             nan     0.1000   -0.0001
##    340        0.0711             nan     0.1000   -0.0004
##    360        0.0637             nan     0.1000   -0.0002
##    380        0.0575             nan     0.1000   -0.0002
##    400        0.0513             nan     0.1000   -0.0001
##    420        0.0456             nan     0.1000   -0.0001
##    440        0.0409             nan     0.1000   -0.0001
##    460        0.0361             nan     0.1000   -0.0002
##    480        0.0325             nan     0.1000   -0.0001
##    500        0.0294             nan     0.1000   -0.0001
## 
## [... verbose gbm output truncated: remaining shrinkage = 0.100 resampling-fold logs omitted ...]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3190             nan     0.0010    0.0004
##      3        1.3181             nan     0.0010    0.0004
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3164             nan     0.0010    0.0004
##      6        1.3156             nan     0.0010    0.0004
##      7        1.3148             nan     0.0010    0.0004
##      8        1.3139             nan     0.0010    0.0004
##      9        1.3130             nan     0.0010    0.0004
##     10        1.3121             nan     0.0010    0.0004
##     20        1.3037             nan     0.0010    0.0004
##     40        1.2875             nan     0.0010    0.0003
##     60        1.2717             nan     0.0010    0.0003
##     80        1.2564             nan     0.0010    0.0003
##    100        1.2414             nan     0.0010    0.0003
##    120        1.2271             nan     0.0010    0.0003
##    140        1.2134             nan     0.0010    0.0003
##    160        1.2000             nan     0.0010    0.0003
##    180        1.1870             nan     0.0010    0.0003
##    200        1.1741             nan     0.0010    0.0002
##    220        1.1619             nan     0.0010    0.0002
##    240        1.1500             nan     0.0010    0.0002
##    260        1.1385             nan     0.0010    0.0003
##    280        1.1271             nan     0.0010    0.0003
##    300        1.1164             nan     0.0010    0.0002
##    320        1.1058             nan     0.0010    0.0002
##    340        1.0956             nan     0.0010    0.0002
##    360        1.0857             nan     0.0010    0.0002
##    380        1.0759             nan     0.0010    0.0002
##    400        1.0664             nan     0.0010    0.0002
##    420        1.0572             nan     0.0010    0.0002
##    440        1.0482             nan     0.0010    0.0002
##    460        1.0392             nan     0.0010    0.0002
##    480        1.0307             nan     0.0010    0.0002
##    500        1.0225             nan     0.0010    0.0002
## 
## [... verbose gbm output truncated: remaining shrinkage = 0.001 resampling-fold logs omitted ...]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3117             nan     0.0100    0.0037
##      2        1.3028             nan     0.0100    0.0042
##      3        1.2944             nan     0.0100    0.0037
##      4        1.2863             nan     0.0100    0.0039
##      5        1.2785             nan     0.0100    0.0035
##      6        1.2702             nan     0.0100    0.0036
##      7        1.2620             nan     0.0100    0.0034
##      8        1.2548             nan     0.0100    0.0029
##      9        1.2475             nan     0.0100    0.0030
##     10        1.2408             nan     0.0100    0.0030
##     20        1.1745             nan     0.0100    0.0027
##     40        1.0667             nan     0.0100    0.0020
##     60        0.9843             nan     0.0100    0.0015
##     80        0.9185             nan     0.0100    0.0011
##    100        0.8660             nan     0.0100    0.0010
##    120        0.8224             nan     0.0100    0.0007
##    140        0.7855             nan     0.0100    0.0006
##    160        0.7553             nan     0.0100    0.0006
##    180        0.7299             nan     0.0100    0.0004
##    200        0.7062             nan     0.0100    0.0002
##    220        0.6859             nan     0.0100    0.0003
##    240        0.6689             nan     0.0100    0.0002
##    260        0.6536             nan     0.0100    0.0001
##    280        0.6375             nan     0.0100    0.0002
##    300        0.6244             nan     0.0100    0.0001
##    320        0.6112             nan     0.0100    0.0002
##    340        0.6002             nan     0.0100    0.0001
##    360        0.5892             nan     0.0100    0.0001
##    380        0.5788             nan     0.0100    0.0001
##    400        0.5693             nan     0.0100   -0.0000
##    420        0.5600             nan     0.0100    0.0001
##    440        0.5509             nan     0.0100    0.0000
##    460        0.5417             nan     0.0100    0.0001
##    480        0.5334             nan     0.0100   -0.0000
##    500        0.5253             nan     0.0100   -0.0001
## 
## [... verbose gbm output truncated: one additional shrinkage = 0.010 resampling-fold log omitted ...]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3117             nan     0.0100    0.0042
##      2        1.3031             nan     0.0100    0.0041
##      3        1.2954             nan     0.0100    0.0038
##      4        1.2878             nan     0.0100    0.0038
##      5        1.2791             nan     0.0100    0.0039
##      6        1.2711             nan     0.0100    0.0036
##      7        1.2630             nan     0.0100    0.0038
##      8        1.2549             nan     0.0100    0.0034
##      9        1.2475             nan     0.0100    0.0035
##     10        1.2399             nan     0.0100    0.0031
##     20        1.1725             nan     0.0100    0.0029
##     40        1.0655             nan     0.0100    0.0020
##     60        0.9843             nan     0.0100    0.0016
##     80        0.9197             nan     0.0100    0.0012
##    100        0.8686             nan     0.0100    0.0008
##    120        0.8266             nan     0.0100    0.0004
##    140        0.7905             nan     0.0100    0.0003
##    160        0.7605             nan     0.0100    0.0005
##    180        0.7356             nan     0.0100    0.0002
##    200        0.7132             nan     0.0100    0.0003
##    220        0.6939             nan     0.0100    0.0001
##    240        0.6779             nan     0.0100    0.0003
##    260        0.6624             nan     0.0100   -0.0000
##    280        0.6484             nan     0.0100    0.0000
##    300        0.6356             nan     0.0100    0.0002
##    320        0.6237             nan     0.0100    0.0000
##    340        0.6121             nan     0.0100    0.0000
##    360        0.6012             nan     0.0100   -0.0000
##    380        0.5911             nan     0.0100    0.0001
##    400        0.5818             nan     0.0100   -0.0001
##    420        0.5733             nan     0.0100   -0.0000
##    440        0.5644             nan     0.0100   -0.0000
##    460        0.5566             nan     0.0100   -0.0000
##    480        0.5482             nan     0.0100   -0.0000
##    500        0.5399             nan     0.0100   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3124             nan     0.0100    0.0041
##      2        1.3045             nan     0.0100    0.0036
##      3        1.2959             nan     0.0100    0.0039
##      4        1.2874             nan     0.0100    0.0041
##      5        1.2793             nan     0.0100    0.0036
##      6        1.2701             nan     0.0100    0.0043
##      7        1.2630             nan     0.0100    0.0034
##      8        1.2545             nan     0.0100    0.0035
##      9        1.2469             nan     0.0100    0.0033
##     10        1.2384             nan     0.0100    0.0039
##     20        1.1639             nan     0.0100    0.0028
##     40        1.0483             nan     0.0100    0.0024
##     60        0.9616             nan     0.0100    0.0013
##     80        0.8919             nan     0.0100    0.0012
##    100        0.8371             nan     0.0100    0.0007
##    120        0.7926             nan     0.0100    0.0007
##    140        0.7541             nan     0.0100    0.0005
##    160        0.7217             nan     0.0100    0.0001
##    180        0.6935             nan     0.0100    0.0005
##    200        0.6706             nan     0.0100    0.0004
##    220        0.6496             nan     0.0100    0.0002
##    240        0.6303             nan     0.0100    0.0002
##    260        0.6126             nan     0.0100    0.0002
##    280        0.5961             nan     0.0100    0.0001
##    300        0.5813             nan     0.0100    0.0001
##    320        0.5673             nan     0.0100   -0.0001
##    340        0.5549             nan     0.0100   -0.0001
##    360        0.5426             nan     0.0100    0.0002
##    380        0.5307             nan     0.0100    0.0001
##    400        0.5198             nan     0.0100   -0.0003
##    420        0.5087             nan     0.0100   -0.0000
##    440        0.4978             nan     0.0100   -0.0000
##    460        0.4878             nan     0.0100    0.0001
##    480        0.4789             nan     0.0100   -0.0002
##    500        0.4699             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3116             nan     0.0100    0.0044
##      2        1.3028             nan     0.0100    0.0039
##      3        1.2939             nan     0.0100    0.0040
##      4        1.2851             nan     0.0100    0.0036
##      5        1.2762             nan     0.0100    0.0038
##      6        1.2686             nan     0.0100    0.0034
##      7        1.2605             nan     0.0100    0.0034
##      8        1.2525             nan     0.0100    0.0032
##      9        1.2448             nan     0.0100    0.0035
##     10        1.2366             nan     0.0100    0.0038
##     20        1.1634             nan     0.0100    0.0034
##     40        1.0482             nan     0.0100    0.0023
##     60        0.9630             nan     0.0100    0.0015
##     80        0.8936             nan     0.0100    0.0011
##    100        0.8385             nan     0.0100    0.0011
##    120        0.7945             nan     0.0100    0.0008
##    140        0.7578             nan     0.0100    0.0004
##    160        0.7272             nan     0.0100    0.0007
##    180        0.7007             nan     0.0100    0.0002
##    200        0.6779             nan     0.0100    0.0001
##    220        0.6553             nan     0.0100    0.0004
##    240        0.6357             nan     0.0100    0.0003
##    260        0.6189             nan     0.0100    0.0002
##    280        0.6036             nan     0.0100    0.0000
##    300        0.5904             nan     0.0100    0.0002
##    320        0.5770             nan     0.0100    0.0001
##    340        0.5644             nan     0.0100    0.0000
##    360        0.5534             nan     0.0100    0.0001
##    380        0.5424             nan     0.0100    0.0000
##    400        0.5323             nan     0.0100   -0.0001
##    420        0.5219             nan     0.0100   -0.0000
##    440        0.5120             nan     0.0100    0.0000
##    460        0.5016             nan     0.0100   -0.0001
##    480        0.4926             nan     0.0100   -0.0000
##    500        0.4835             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3116             nan     0.0100    0.0041
##      2        1.3027             nan     0.0100    0.0040
##      3        1.2938             nan     0.0100    0.0041
##      4        1.2853             nan     0.0100    0.0035
##      5        1.2761             nan     0.0100    0.0041
##      6        1.2680             nan     0.0100    0.0039
##      7        1.2598             nan     0.0100    0.0041
##      8        1.2534             nan     0.0100    0.0025
##      9        1.2453             nan     0.0100    0.0035
##     10        1.2375             nan     0.0100    0.0034
##     20        1.1671             nan     0.0100    0.0032
##     40        1.0523             nan     0.0100    0.0022
##     60        0.9657             nan     0.0100    0.0018
##     80        0.8975             nan     0.0100    0.0016
##    100        0.8427             nan     0.0100    0.0010
##    120        0.7978             nan     0.0100    0.0007
##    140        0.7609             nan     0.0100    0.0003
##    160        0.7308             nan     0.0100    0.0004
##    180        0.7046             nan     0.0100    0.0002
##    200        0.6823             nan     0.0100    0.0001
##    220        0.6624             nan     0.0100    0.0002
##    240        0.6449             nan     0.0100    0.0002
##    260        0.6284             nan     0.0100   -0.0001
##    280        0.6141             nan     0.0100   -0.0000
##    300        0.5997             nan     0.0100   -0.0000
##    320        0.5862             nan     0.0100    0.0000
##    340        0.5742             nan     0.0100    0.0000
##    360        0.5625             nan     0.0100    0.0000
##    380        0.5505             nan     0.0100    0.0001
##    400        0.5395             nan     0.0100    0.0000
##    420        0.5289             nan     0.0100    0.0000
##    440        0.5200             nan     0.0100    0.0001
##    460        0.5110             nan     0.0100   -0.0002
##    480        0.5013             nan     0.0100   -0.0000
##    500        0.4928             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3107             nan     0.0100    0.0044
##      2        1.3015             nan     0.0100    0.0042
##      3        1.2925             nan     0.0100    0.0042
##      4        1.2834             nan     0.0100    0.0038
##      5        1.2741             nan     0.0100    0.0042
##      6        1.2650             nan     0.0100    0.0038
##      7        1.2571             nan     0.0100    0.0035
##      8        1.2481             nan     0.0100    0.0040
##      9        1.2393             nan     0.0100    0.0041
##     10        1.2304             nan     0.0100    0.0039
##     20        1.1535             nan     0.0100    0.0029
##     40        1.0320             nan     0.0100    0.0023
##     60        0.9427             nan     0.0100    0.0017
##     80        0.8725             nan     0.0100    0.0011
##    100        0.8125             nan     0.0100    0.0009
##    120        0.7670             nan     0.0100    0.0007
##    140        0.7291             nan     0.0100    0.0006
##    160        0.6942             nan     0.0100    0.0003
##    180        0.6655             nan     0.0100    0.0003
##    200        0.6399             nan     0.0100    0.0002
##    220        0.6175             nan     0.0100    0.0002
##    240        0.5989             nan     0.0100    0.0001
##    260        0.5805             nan     0.0100    0.0003
##    280        0.5633             nan     0.0100    0.0000
##    300        0.5475             nan     0.0100    0.0001
##    320        0.5330             nan     0.0100    0.0002
##    340        0.5195             nan     0.0100   -0.0001
##    360        0.5062             nan     0.0100    0.0000
##    380        0.4934             nan     0.0100   -0.0000
##    400        0.4815             nan     0.0100    0.0000
##    420        0.4701             nan     0.0100    0.0001
##    440        0.4591             nan     0.0100   -0.0000
##    460        0.4484             nan     0.0100    0.0000
##    480        0.4386             nan     0.0100   -0.0000
##    500        0.4291             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3115             nan     0.0100    0.0044
##      2        1.3024             nan     0.0100    0.0038
##      3        1.2936             nan     0.0100    0.0040
##      4        1.2850             nan     0.0100    0.0043
##      5        1.2768             nan     0.0100    0.0038
##      6        1.2684             nan     0.0100    0.0039
##      7        1.2601             nan     0.0100    0.0034
##      8        1.2517             nan     0.0100    0.0037
##      9        1.2429             nan     0.0100    0.0039
##     10        1.2359             nan     0.0100    0.0031
##     20        1.1588             nan     0.0100    0.0031
##     40        1.0375             nan     0.0100    0.0023
##     60        0.9466             nan     0.0100    0.0018
##     80        0.8756             nan     0.0100    0.0013
##    100        0.8188             nan     0.0100    0.0011
##    120        0.7723             nan     0.0100    0.0006
##    140        0.7322             nan     0.0100    0.0005
##    160        0.6997             nan     0.0100    0.0001
##    180        0.6712             nan     0.0100    0.0006
##    200        0.6475             nan     0.0100   -0.0000
##    220        0.6262             nan     0.0100    0.0003
##    240        0.6063             nan     0.0100    0.0001
##    260        0.5877             nan     0.0100    0.0000
##    280        0.5707             nan     0.0100   -0.0000
##    300        0.5546             nan     0.0100    0.0002
##    320        0.5404             nan     0.0100    0.0000
##    340        0.5263             nan     0.0100   -0.0000
##    360        0.5134             nan     0.0100   -0.0002
##    380        0.5004             nan     0.0100    0.0001
##    400        0.4891             nan     0.0100   -0.0000
##    420        0.4783             nan     0.0100    0.0000
##    440        0.4670             nan     0.0100   -0.0001
##    460        0.4569             nan     0.0100   -0.0000
##    480        0.4470             nan     0.0100   -0.0001
##    500        0.4368             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3115             nan     0.0100    0.0042
##      2        1.3021             nan     0.0100    0.0042
##      3        1.2940             nan     0.0100    0.0037
##      4        1.2850             nan     0.0100    0.0042
##      5        1.2766             nan     0.0100    0.0035
##      6        1.2673             nan     0.0100    0.0041
##      7        1.2584             nan     0.0100    0.0038
##      8        1.2503             nan     0.0100    0.0037
##      9        1.2416             nan     0.0100    0.0040
##     10        1.2333             nan     0.0100    0.0035
##     20        1.1588             nan     0.0100    0.0025
##     40        1.0415             nan     0.0100    0.0022
##     60        0.9506             nan     0.0100    0.0013
##     80        0.8795             nan     0.0100    0.0013
##    100        0.8227             nan     0.0100    0.0009
##    120        0.7769             nan     0.0100    0.0006
##    140        0.7405             nan     0.0100    0.0004
##    160        0.7085             nan     0.0100    0.0003
##    180        0.6803             nan     0.0100    0.0003
##    200        0.6557             nan     0.0100    0.0004
##    220        0.6354             nan     0.0100    0.0001
##    240        0.6158             nan     0.0100   -0.0001
##    260        0.5976             nan     0.0100    0.0000
##    280        0.5815             nan     0.0100   -0.0000
##    300        0.5665             nan     0.0100    0.0000
##    320        0.5530             nan     0.0100    0.0001
##    340        0.5387             nan     0.0100    0.0000
##    360        0.5250             nan     0.0100    0.0001
##    380        0.5142             nan     0.0100   -0.0001
##    400        0.5030             nan     0.0100   -0.0000
##    420        0.4918             nan     0.0100   -0.0000
##    440        0.4821             nan     0.0100   -0.0001
##    460        0.4723             nan     0.0100    0.0001
##    480        0.4622             nan     0.0100   -0.0002
##    500        0.4530             nan     0.0100   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2390             nan     0.1000    0.0380
##      2        1.1665             nan     0.1000    0.0319
##      3        1.1073             nan     0.1000    0.0276
##      4        1.0578             nan     0.1000    0.0233
##      5        1.0142             nan     0.1000    0.0174
##      6        0.9747             nan     0.1000    0.0176
##      7        0.9387             nan     0.1000    0.0154
##      8        0.9123             nan     0.1000    0.0097
##      9        0.8800             nan     0.1000    0.0125
##     10        0.8553             nan     0.1000    0.0080
##     20        0.7077             nan     0.1000    0.0027
##     40        0.5731             nan     0.1000    0.0009
##     60        0.4900             nan     0.1000    0.0001
##     80        0.4305             nan     0.1000   -0.0009
##    100        0.3739             nan     0.1000   -0.0001
##    120        0.3309             nan     0.1000   -0.0004
##    140        0.2939             nan     0.1000   -0.0008
##    160        0.2636             nan     0.1000   -0.0009
##    180        0.2338             nan     0.1000   -0.0000
##    200        0.2118             nan     0.1000   -0.0008
##    220        0.1925             nan     0.1000   -0.0008
##    240        0.1745             nan     0.1000   -0.0002
##    260        0.1603             nan     0.1000   -0.0003
##    280        0.1469             nan     0.1000   -0.0003
##    300        0.1329             nan     0.1000   -0.0002
##    320        0.1216             nan     0.1000   -0.0005
##    340        0.1108             nan     0.1000   -0.0002
##    360        0.1004             nan     0.1000   -0.0001
##    380        0.0921             nan     0.1000   -0.0004
##    400        0.0853             nan     0.1000   -0.0003
##    420        0.0783             nan     0.1000   -0.0003
##    440        0.0716             nan     0.1000   -0.0002
##    460        0.0666             nan     0.1000   -0.0002
##    480        0.0613             nan     0.1000   -0.0000
##    500        0.0565             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2369             nan     0.1000    0.0412
##      2        1.1689             nan     0.1000    0.0320
##      3        1.1105             nan     0.1000    0.0250
##      4        1.0586             nan     0.1000    0.0243
##      5        1.0181             nan     0.1000    0.0192
##      6        0.9777             nan     0.1000    0.0170
##      7        0.9444             nan     0.1000    0.0135
##      8        0.9154             nan     0.1000    0.0115
##      9        0.8888             nan     0.1000    0.0108
##     10        0.8663             nan     0.1000    0.0094
##     20        0.7118             nan     0.1000    0.0033
##     40        0.5855             nan     0.1000   -0.0007
##     60        0.4978             nan     0.1000    0.0006
##     80        0.4275             nan     0.1000   -0.0014
##    100        0.3774             nan     0.1000   -0.0004
##    120        0.3343             nan     0.1000   -0.0009
##    140        0.3007             nan     0.1000   -0.0009
##    160        0.2708             nan     0.1000   -0.0003
##    180        0.2468             nan     0.1000   -0.0004
##    200        0.2220             nan     0.1000   -0.0004
##    220        0.2044             nan     0.1000   -0.0005
##    240        0.1848             nan     0.1000   -0.0007
##    260        0.1676             nan     0.1000   -0.0005
##    280        0.1525             nan     0.1000   -0.0003
##    300        0.1396             nan     0.1000   -0.0004
##    320        0.1273             nan     0.1000   -0.0003
##    340        0.1163             nan     0.1000   -0.0003
##    360        0.1079             nan     0.1000   -0.0002
##    380        0.0992             nan     0.1000   -0.0002
##    400        0.0920             nan     0.1000   -0.0003
##    420        0.0838             nan     0.1000   -0.0003
##    440        0.0771             nan     0.1000   -0.0001
##    460        0.0704             nan     0.1000   -0.0002
##    480        0.0651             nan     0.1000   -0.0001
##    500        0.0603             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2496             nan     0.1000    0.0333
##      2        1.1719             nan     0.1000    0.0303
##      3        1.1162             nan     0.1000    0.0259
##      4        1.0687             nan     0.1000    0.0211
##      5        1.0250             nan     0.1000    0.0177
##      6        0.9855             nan     0.1000    0.0155
##      7        0.9543             nan     0.1000    0.0133
##      8        0.9224             nan     0.1000    0.0141
##      9        0.8965             nan     0.1000    0.0093
##     10        0.8699             nan     0.1000    0.0072
##     20        0.7147             nan     0.1000    0.0016
##     40        0.5858             nan     0.1000   -0.0009
##     60        0.5076             nan     0.1000    0.0007
##     80        0.4488             nan     0.1000   -0.0008
##    100        0.4018             nan     0.1000   -0.0015
##    120        0.3611             nan     0.1000   -0.0001
##    140        0.3204             nan     0.1000   -0.0006
##    160        0.2904             nan     0.1000   -0.0005
##    180        0.2645             nan     0.1000   -0.0003
##    200        0.2405             nan     0.1000   -0.0007
##    220        0.2211             nan     0.1000   -0.0009
##    240        0.2000             nan     0.1000   -0.0006
##    260        0.1821             nan     0.1000   -0.0004
##    280        0.1678             nan     0.1000   -0.0003
##    300        0.1559             nan     0.1000   -0.0007
##    320        0.1430             nan     0.1000   -0.0003
##    340        0.1316             nan     0.1000   -0.0004
##    360        0.1214             nan     0.1000   -0.0006
##    380        0.1110             nan     0.1000   -0.0003
##    400        0.1032             nan     0.1000   -0.0006
##    420        0.0954             nan     0.1000   -0.0001
##    440        0.0884             nan     0.1000   -0.0001
##    460        0.0818             nan     0.1000   -0.0002
##    480        0.0755             nan     0.1000   -0.0001
##    500        0.0691             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2306             nan     0.1000    0.0430
##      2        1.1588             nan     0.1000    0.0289
##      3        1.0998             nan     0.1000    0.0236
##      4        1.0497             nan     0.1000    0.0222
##      5        1.0015             nan     0.1000    0.0186
##      6        0.9591             nan     0.1000    0.0180
##      7        0.9186             nan     0.1000    0.0154
##      8        0.8886             nan     0.1000    0.0107
##      9        0.8656             nan     0.1000    0.0094
##     10        0.8408             nan     0.1000    0.0088
##     20        0.6858             nan     0.1000   -0.0001
##     40        0.5346             nan     0.1000    0.0001
##     60        0.4391             nan     0.1000   -0.0006
##     80        0.3686             nan     0.1000   -0.0012
##    100        0.3193             nan     0.1000   -0.0000
##    120        0.2781             nan     0.1000   -0.0006
##    140        0.2422             nan     0.1000   -0.0003
##    160        0.2120             nan     0.1000   -0.0009
##    180        0.1864             nan     0.1000   -0.0003
##    200        0.1643             nan     0.1000    0.0002
##    220        0.1464             nan     0.1000   -0.0002
##    240        0.1292             nan     0.1000   -0.0000
##    260        0.1153             nan     0.1000   -0.0000
##    280        0.1020             nan     0.1000   -0.0003
##    300        0.0910             nan     0.1000   -0.0003
##    320        0.0813             nan     0.1000   -0.0003
##    340        0.0729             nan     0.1000   -0.0001
##    360        0.0660             nan     0.1000   -0.0000
##    380        0.0593             nan     0.1000   -0.0001
##    400        0.0538             nan     0.1000   -0.0002
##    420        0.0490             nan     0.1000   -0.0002
##    440        0.0442             nan     0.1000   -0.0000
##    460        0.0402             nan     0.1000   -0.0001
##    480        0.0363             nan     0.1000   -0.0002
##    500        0.0331             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2391             nan     0.1000    0.0386
##      2        1.1734             nan     0.1000    0.0307
##      3        1.1095             nan     0.1000    0.0257
##      4        1.0490             nan     0.1000    0.0246
##      5        1.0059             nan     0.1000    0.0176
##      6        0.9661             nan     0.1000    0.0159
##      7        0.9243             nan     0.1000    0.0182
##      8        0.8936             nan     0.1000    0.0117
##      9        0.8665             nan     0.1000    0.0107
##     10        0.8453             nan     0.1000    0.0079
##     20        0.6894             nan     0.1000    0.0037
##     40        0.5419             nan     0.1000    0.0010
##     60        0.4525             nan     0.1000   -0.0012
##     80        0.3904             nan     0.1000   -0.0015
##    100        0.3337             nan     0.1000   -0.0006
##    120        0.2854             nan     0.1000   -0.0010
##    140        0.2528             nan     0.1000   -0.0008
##    160        0.2218             nan     0.1000   -0.0001
##    180        0.1954             nan     0.1000   -0.0011
##    200        0.1702             nan     0.1000   -0.0003
##    220        0.1499             nan     0.1000   -0.0004
##    240        0.1330             nan     0.1000   -0.0005
##    260        0.1181             nan     0.1000   -0.0002
##    280        0.1062             nan     0.1000   -0.0004
##    300        0.0939             nan     0.1000   -0.0001
##    320        0.0844             nan     0.1000   -0.0001
##    340        0.0757             nan     0.1000   -0.0001
##    360        0.0682             nan     0.1000   -0.0001
##    380        0.0613             nan     0.1000   -0.0001
##    400        0.0554             nan     0.1000   -0.0002
##    420        0.0487             nan     0.1000    0.0000
##    440        0.0435             nan     0.1000   -0.0002
##    460        0.0398             nan     0.1000   -0.0001
##    480        0.0358             nan     0.1000   -0.0001
##    500        0.0324             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2307             nan     0.1000    0.0411
##      2        1.1535             nan     0.1000    0.0329
##      3        1.0868             nan     0.1000    0.0313
##      4        1.0425             nan     0.1000    0.0175
##      5        0.9927             nan     0.1000    0.0222
##      6        0.9541             nan     0.1000    0.0155
##      7        0.9208             nan     0.1000    0.0111
##      8        0.8895             nan     0.1000    0.0102
##      9        0.8605             nan     0.1000    0.0114
##     10        0.8381             nan     0.1000    0.0080
##     20        0.6784             nan     0.1000    0.0005
##     40        0.5425             nan     0.1000   -0.0001
##     60        0.4600             nan     0.1000   -0.0002
##     80        0.3838             nan     0.1000   -0.0009
##    100        0.3312             nan     0.1000   -0.0002
##    120        0.2857             nan     0.1000   -0.0006
##    140        0.2531             nan     0.1000   -0.0006
##    160        0.2224             nan     0.1000   -0.0011
##    180        0.1965             nan     0.1000   -0.0001
##    200        0.1739             nan     0.1000   -0.0004
##    220        0.1561             nan     0.1000   -0.0006
##    240        0.1391             nan     0.1000   -0.0009
##    260        0.1247             nan     0.1000   -0.0004
##    280        0.1112             nan     0.1000   -0.0004
##    300        0.1004             nan     0.1000   -0.0002
##    320        0.0897             nan     0.1000   -0.0003
##    340        0.0798             nan     0.1000   -0.0001
##    360        0.0722             nan     0.1000   -0.0002
##    380        0.0655             nan     0.1000   -0.0001
##    400        0.0595             nan     0.1000   -0.0002
##    420        0.0539             nan     0.1000   -0.0002
##    440        0.0487             nan     0.1000   -0.0002
##    460        0.0446             nan     0.1000   -0.0000
##    480        0.0401             nan     0.1000   -0.0001
##    500        0.0364             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2249             nan     0.1000    0.0421
##      2        1.1499             nan     0.1000    0.0300
##      3        1.0932             nan     0.1000    0.0243
##      4        1.0371             nan     0.1000    0.0266
##      5        0.9873             nan     0.1000    0.0198
##      6        0.9508             nan     0.1000    0.0134
##      7        0.9084             nan     0.1000    0.0179
##      8        0.8696             nan     0.1000    0.0148
##      9        0.8389             nan     0.1000    0.0116
##     10        0.8166             nan     0.1000    0.0085
##     20        0.6429             nan     0.1000   -0.0003
##     40        0.4837             nan     0.1000   -0.0001
##     60        0.3937             nan     0.1000   -0.0002
##     80        0.3186             nan     0.1000   -0.0012
##    100        0.2702             nan     0.1000   -0.0006
##    120        0.2227             nan     0.1000   -0.0004
##    140        0.1881             nan     0.1000   -0.0003
##    160        0.1589             nan     0.1000   -0.0000
##    180        0.1381             nan     0.1000   -0.0004
##    200        0.1211             nan     0.1000   -0.0001
##    220        0.1044             nan     0.1000   -0.0004
##    240        0.0914             nan     0.1000   -0.0002
##    260        0.0797             nan     0.1000   -0.0000
##    280        0.0697             nan     0.1000   -0.0002
##    300        0.0613             nan     0.1000   -0.0001
##    320        0.0542             nan     0.1000   -0.0000
##    340        0.0479             nan     0.1000    0.0000
##    360        0.0428             nan     0.1000   -0.0001
##    380        0.0379             nan     0.1000   -0.0000
##    400        0.0335             nan     0.1000   -0.0000
##    420        0.0296             nan     0.1000    0.0000
##    440        0.0261             nan     0.1000   -0.0001
##    460        0.0231             nan     0.1000   -0.0000
##    480        0.0203             nan     0.1000   -0.0000
##    500        0.0183             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2195             nan     0.1000    0.0448
##      2        1.1484             nan     0.1000    0.0369
##      3        1.0880             nan     0.1000    0.0233
##      4        1.0278             nan     0.1000    0.0232
##      5        0.9807             nan     0.1000    0.0216
##      6        0.9381             nan     0.1000    0.0171
##      7        0.9073             nan     0.1000    0.0133
##      8        0.8744             nan     0.1000    0.0123
##      9        0.8469             nan     0.1000    0.0114
##     10        0.8215             nan     0.1000    0.0106
##     20        0.6548             nan     0.1000    0.0032
##     40        0.4957             nan     0.1000   -0.0002
##     60        0.4047             nan     0.1000   -0.0011
##     80        0.3350             nan     0.1000   -0.0006
##    100        0.2772             nan     0.1000    0.0004
##    120        0.2322             nan     0.1000   -0.0001
##    140        0.1944             nan     0.1000   -0.0011
##    160        0.1683             nan     0.1000   -0.0000
##    180        0.1438             nan     0.1000   -0.0005
##    200        0.1232             nan     0.1000   -0.0004
##    220        0.1083             nan     0.1000   -0.0006
##    240        0.0947             nan     0.1000   -0.0001
##    260        0.0823             nan     0.1000   -0.0003
##    280        0.0716             nan     0.1000   -0.0002
##    300        0.0622             nan     0.1000   -0.0002
##    320        0.0546             nan     0.1000   -0.0000
##    340        0.0479             nan     0.1000   -0.0001
##    360        0.0420             nan     0.1000   -0.0002
##    380        0.0369             nan     0.1000   -0.0002
##    400        0.0326             nan     0.1000   -0.0001
##    420        0.0289             nan     0.1000   -0.0000
##    440        0.0255             nan     0.1000   -0.0001
##    460        0.0229             nan     0.1000   -0.0001
##    480        0.0202             nan     0.1000   -0.0000
##    500        0.0180             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2224             nan     0.1000    0.0457
##      2        1.1503             nan     0.1000    0.0303
##      3        1.0878             nan     0.1000    0.0276
##      4        1.0377             nan     0.1000    0.0234
##      5        0.9947             nan     0.1000    0.0189
##      6        0.9563             nan     0.1000    0.0163
##      7        0.9160             nan     0.1000    0.0170
##      8        0.8867             nan     0.1000    0.0112
##      9        0.8554             nan     0.1000    0.0125
##     10        0.8264             nan     0.1000    0.0111
##     20        0.6587             nan     0.1000    0.0023
##     40        0.5114             nan     0.1000   -0.0014
##     60        0.4074             nan     0.1000   -0.0007
##     80        0.3376             nan     0.1000    0.0001
##    100        0.2861             nan     0.1000   -0.0009
##    120        0.2463             nan     0.1000   -0.0012
##    140        0.2117             nan     0.1000   -0.0007
##    160        0.1808             nan     0.1000   -0.0009
##    180        0.1555             nan     0.1000   -0.0004
##    200        0.1339             nan     0.1000   -0.0006
##    220        0.1147             nan     0.1000   -0.0002
##    240        0.0991             nan     0.1000   -0.0006
##    260        0.0877             nan     0.1000   -0.0003
##    280        0.0766             nan     0.1000   -0.0003
##    300        0.0670             nan     0.1000   -0.0001
##    320        0.0583             nan     0.1000   -0.0001
##    340        0.0518             nan     0.1000   -0.0002
##    360        0.0463             nan     0.1000   -0.0002
##    380        0.0407             nan     0.1000   -0.0000
##    400        0.0361             nan     0.1000   -0.0002
##    420        0.0318             nan     0.1000   -0.0001
##    440        0.0281             nan     0.1000   -0.0000
##    460        0.0251             nan     0.1000   -0.0001
##    480        0.0223             nan     0.1000   -0.0002
##    500        0.0198             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3204             nan     0.0010    0.0004
##      2        1.3195             nan     0.0010    0.0004
##      3        1.3186             nan     0.0010    0.0004
##      4        1.3177             nan     0.0010    0.0004
##      5        1.3168             nan     0.0010    0.0004
##      6        1.3160             nan     0.0010    0.0004
##      7        1.3151             nan     0.0010    0.0004
##      8        1.3143             nan     0.0010    0.0003
##      9        1.3134             nan     0.0010    0.0004
##     10        1.3126             nan     0.0010    0.0003
##     20        1.3046             nan     0.0010    0.0003
##     40        1.2880             nan     0.0010    0.0003
##     60        1.2723             nan     0.0010    0.0004
##     80        1.2574             nan     0.0010    0.0003
##    100        1.2430             nan     0.0010    0.0003
##    120        1.2288             nan     0.0010    0.0003
##    140        1.2153             nan     0.0010    0.0003
##    160        1.2018             nan     0.0010    0.0003
##    180        1.1895             nan     0.0010    0.0002
##    200        1.1769             nan     0.0010    0.0003
##    220        1.1651             nan     0.0010    0.0002
##    240        1.1536             nan     0.0010    0.0002
##    260        1.1422             nan     0.0010    0.0003
##    280        1.1312             nan     0.0010    0.0003
##    300        1.1208             nan     0.0010    0.0003
##    320        1.1103             nan     0.0010    0.0002
##    340        1.1001             nan     0.0010    0.0002
##    360        1.0904             nan     0.0010    0.0002
##    380        1.0807             nan     0.0010    0.0002
##    400        1.0717             nan     0.0010    0.0002
##    420        1.0627             nan     0.0010    0.0002
##    440        1.0539             nan     0.0010    0.0002
##    460        1.0455             nan     0.0010    0.0002
##    480        1.0371             nan     0.0010    0.0002
##    500        1.0289             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3195             nan     0.0010    0.0004
##      3        1.3187             nan     0.0010    0.0003
##      4        1.3179             nan     0.0010    0.0004
##      5        1.3170             nan     0.0010    0.0004
##      6        1.3162             nan     0.0010    0.0004
##      7        1.3154             nan     0.0010    0.0004
##      8        1.3145             nan     0.0010    0.0004
##      9        1.3137             nan     0.0010    0.0004
##     10        1.3128             nan     0.0010    0.0004
##     20        1.3045             nan     0.0010    0.0004
##     40        1.2878             nan     0.0010    0.0004
##     60        1.2720             nan     0.0010    0.0003
##     80        1.2569             nan     0.0010    0.0004
##    100        1.2426             nan     0.0010    0.0004
##    120        1.2287             nan     0.0010    0.0003
##    140        1.2148             nan     0.0010    0.0003
##    160        1.2016             nan     0.0010    0.0002
##    180        1.1891             nan     0.0010    0.0003
##    200        1.1767             nan     0.0010    0.0003
##    220        1.1650             nan     0.0010    0.0002
##    240        1.1533             nan     0.0010    0.0003
##    260        1.1418             nan     0.0010    0.0002
##    280        1.1310             nan     0.0010    0.0002
##    300        1.1202             nan     0.0010    0.0003
##    320        1.1099             nan     0.0010    0.0002
##    340        1.1000             nan     0.0010    0.0002
##    360        1.0903             nan     0.0010    0.0002
##    380        1.0807             nan     0.0010    0.0002
##    400        1.0718             nan     0.0010    0.0002
##    420        1.0629             nan     0.0010    0.0002
##    440        1.0541             nan     0.0010    0.0002
##    460        1.0456             nan     0.0010    0.0002
##    480        1.0373             nan     0.0010    0.0002
##    500        1.0292             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3195             nan     0.0010    0.0004
##      3        1.3186             nan     0.0010    0.0004
##      4        1.3177             nan     0.0010    0.0004
##      5        1.3169             nan     0.0010    0.0004
##      6        1.3160             nan     0.0010    0.0004
##      7        1.3151             nan     0.0010    0.0004
##      8        1.3143             nan     0.0010    0.0004
##      9        1.3134             nan     0.0010    0.0004
##     10        1.3126             nan     0.0010    0.0004
##     20        1.3042             nan     0.0010    0.0004
##     40        1.2882             nan     0.0010    0.0004
##     60        1.2729             nan     0.0010    0.0003
##     80        1.2581             nan     0.0010    0.0004
##    100        1.2435             nan     0.0010    0.0003
##    120        1.2293             nan     0.0010    0.0003
##    140        1.2159             nan     0.0010    0.0003
##    160        1.2027             nan     0.0010    0.0003
##    180        1.1898             nan     0.0010    0.0003
##    200        1.1779             nan     0.0010    0.0002
##    220        1.1662             nan     0.0010    0.0003
##    240        1.1547             nan     0.0010    0.0002
##    260        1.1435             nan     0.0010    0.0003
##    280        1.1327             nan     0.0010    0.0002
##    300        1.1219             nan     0.0010    0.0002
##    320        1.1117             nan     0.0010    0.0002
##    340        1.1015             nan     0.0010    0.0002
##    360        1.0918             nan     0.0010    0.0002
##    380        1.0825             nan     0.0010    0.0002
##    400        1.0735             nan     0.0010    0.0002
##    420        1.0646             nan     0.0010    0.0002
##    440        1.0559             nan     0.0010    0.0002
##    460        1.0474             nan     0.0010    0.0002
##    480        1.0392             nan     0.0010    0.0001
##    500        1.0313             nan     0.0010    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3204             nan     0.0010    0.0004
##      2        1.3195             nan     0.0010    0.0004
##      3        1.3186             nan     0.0010    0.0004
##      4        1.3177             nan     0.0010    0.0004
##      5        1.3168             nan     0.0010    0.0004
##      6        1.3159             nan     0.0010    0.0004
##      7        1.3149             nan     0.0010    0.0004
##      8        1.3139             nan     0.0010    0.0004
##      9        1.3131             nan     0.0010    0.0004
##     10        1.3121             nan     0.0010    0.0004
##     20        1.3033             nan     0.0010    0.0004
##     40        1.2857             nan     0.0010    0.0004
##     60        1.2690             nan     0.0010    0.0004
##     80        1.2528             nan     0.0010    0.0003
##    100        1.2375             nan     0.0010    0.0004
##    120        1.2227             nan     0.0010    0.0003
##    140        1.2080             nan     0.0010    0.0003
##    160        1.1942             nan     0.0010    0.0003
##    180        1.1805             nan     0.0010    0.0003
##    200        1.1672             nan     0.0010    0.0003
##    220        1.1544             nan     0.0010    0.0003
##    240        1.1420             nan     0.0010    0.0003
##    260        1.1299             nan     0.0010    0.0002
##    280        1.1185             nan     0.0010    0.0003
##    300        1.1072             nan     0.0010    0.0003
##    320        1.0963             nan     0.0010    0.0002
##    340        1.0857             nan     0.0010    0.0002
##    360        1.0752             nan     0.0010    0.0002
##    380        1.0654             nan     0.0010    0.0002
##    400        1.0556             nan     0.0010    0.0002
##    420        1.0461             nan     0.0010    0.0002
##    440        1.0371             nan     0.0010    0.0002
##    460        1.0280             nan     0.0010    0.0002
##    480        1.0195             nan     0.0010    0.0002
##    500        1.0111             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0004
##      2        1.3193             nan     0.0010    0.0003
##      3        1.3184             nan     0.0010    0.0004
##      4        1.3176             nan     0.0010    0.0004
##      5        1.3168             nan     0.0010    0.0004
##      6        1.3159             nan     0.0010    0.0004
##      7        1.3150             nan     0.0010    0.0004
##      8        1.3141             nan     0.0010    0.0004
##      9        1.3132             nan     0.0010    0.0004
##     10        1.3123             nan     0.0010    0.0004
##     20        1.3037             nan     0.0010    0.0004
##     40        1.2866             nan     0.0010    0.0004
##     60        1.2700             nan     0.0010    0.0004
##     80        1.2541             nan     0.0010    0.0003
##    100        1.2387             nan     0.0010    0.0003
##    120        1.2237             nan     0.0010    0.0003
##    140        1.2095             nan     0.0010    0.0003
##    160        1.1958             nan     0.0010    0.0003
##    180        1.1824             nan     0.0010    0.0003
##    200        1.1693             nan     0.0010    0.0003
##    220        1.1566             nan     0.0010    0.0003
##    240        1.1445             nan     0.0010    0.0002
##    260        1.1325             nan     0.0010    0.0002
##    280        1.1211             nan     0.0010    0.0002
##    300        1.1098             nan     0.0010    0.0002
##    320        1.0989             nan     0.0010    0.0002
##    340        1.0885             nan     0.0010    0.0002
##    360        1.0781             nan     0.0010    0.0002
##    380        1.0682             nan     0.0010    0.0002
##    400        1.0584             nan     0.0010    0.0002
##    420        1.0490             nan     0.0010    0.0002
##    440        1.0399             nan     0.0010    0.0002
##    460        1.0309             nan     0.0010    0.0002
##    480        1.0222             nan     0.0010    0.0002
##    500        1.0137             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3184             nan     0.0010    0.0004
##      4        1.3176             nan     0.0010    0.0004
##      5        1.3167             nan     0.0010    0.0004
##      6        1.3158             nan     0.0010    0.0004
##      7        1.3149             nan     0.0010    0.0003
##      8        1.3141             nan     0.0010    0.0004
##      9        1.3132             nan     0.0010    0.0004
##     10        1.3123             nan     0.0010    0.0004
##     20        1.3035             nan     0.0010    0.0004
##     40        1.2865             nan     0.0010    0.0004
##     60        1.2703             nan     0.0010    0.0004
##     80        1.2546             nan     0.0010    0.0004
##    100        1.2395             nan     0.0010    0.0003
##    120        1.2248             nan     0.0010    0.0004
##    140        1.2106             nan     0.0010    0.0003
##    160        1.1966             nan     0.0010    0.0003
##    180        1.1831             nan     0.0010    0.0003
##    200        1.1699             nan     0.0010    0.0003
##    220        1.1573             nan     0.0010    0.0003
##    240        1.1451             nan     0.0010    0.0003
##    260        1.1331             nan     0.0010    0.0003
##    280        1.1216             nan     0.0010    0.0003
##    300        1.1105             nan     0.0010    0.0003
##    320        1.0999             nan     0.0010    0.0002
##    340        1.0894             nan     0.0010    0.0002
##    360        1.0794             nan     0.0010    0.0002
##    380        1.0694             nan     0.0010    0.0002
##    400        1.0598             nan     0.0010    0.0002
##    420        1.0505             nan     0.0010    0.0002
##    440        1.0414             nan     0.0010    0.0002
##    460        1.0326             nan     0.0010    0.0002
##    480        1.0242             nan     0.0010    0.0001
##    500        1.0158             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3183             nan     0.0010    0.0004
##      4        1.3174             nan     0.0010    0.0004
##      5        1.3165             nan     0.0010    0.0004
##      6        1.3155             nan     0.0010    0.0004
##      7        1.3146             nan     0.0010    0.0004
##      8        1.3137             nan     0.0010    0.0004
##      9        1.3128             nan     0.0010    0.0004
##     10        1.3118             nan     0.0010    0.0004
##     20        1.3027             nan     0.0010    0.0004
##     40        1.2848             nan     0.0010    0.0004
##     60        1.2674             nan     0.0010    0.0004
##     80        1.2503             nan     0.0010    0.0004
##    100        1.2342             nan     0.0010    0.0003
##    120        1.2187             nan     0.0010    0.0004
##    140        1.2038             nan     0.0010    0.0003
##    160        1.1894             nan     0.0010    0.0003
##    180        1.1752             nan     0.0010    0.0003
##    200        1.1616             nan     0.0010    0.0003
##    220        1.1484             nan     0.0010    0.0002
##    240        1.1353             nan     0.0010    0.0003
##    260        1.1228             nan     0.0010    0.0003
##    280        1.1108             nan     0.0010    0.0003
##    300        1.0992             nan     0.0010    0.0002
##    320        1.0879             nan     0.0010    0.0003
##    340        1.0768             nan     0.0010    0.0002
##    360        1.0656             nan     0.0010    0.0002
##    380        1.0549             nan     0.0010    0.0002
##    400        1.0445             nan     0.0010    0.0002
##    420        1.0347             nan     0.0010    0.0002
##    440        1.0251             nan     0.0010    0.0001
##    460        1.0157             nan     0.0010    0.0002
##    480        1.0067             nan     0.0010    0.0001
##    500        0.9976             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3192             nan     0.0010    0.0004
##      3        1.3183             nan     0.0010    0.0004
##      4        1.3174             nan     0.0010    0.0004
##      5        1.3165             nan     0.0010    0.0004
##      6        1.3155             nan     0.0010    0.0005
##      7        1.3145             nan     0.0010    0.0005
##      8        1.3136             nan     0.0010    0.0004
##      9        1.3127             nan     0.0010    0.0004
##     10        1.3118             nan     0.0010    0.0004
##     20        1.3027             nan     0.0010    0.0004
##     40        1.2846             nan     0.0010    0.0004
##     60        1.2669             nan     0.0010    0.0004
##     80        1.2502             nan     0.0010    0.0004
##    100        1.2339             nan     0.0010    0.0003
##    120        1.2184             nan     0.0010    0.0003
##    140        1.2036             nan     0.0010    0.0003
##    160        1.1888             nan     0.0010    0.0003
##    180        1.1748             nan     0.0010    0.0003
##    200        1.1612             nan     0.0010    0.0003
##    220        1.1478             nan     0.0010    0.0003
##    240        1.1351             nan     0.0010    0.0003
##    260        1.1229             nan     0.0010    0.0002
##    280        1.1110             nan     0.0010    0.0003
##    300        1.0994             nan     0.0010    0.0002
##    320        1.0883             nan     0.0010    0.0002
##    340        1.0775             nan     0.0010    0.0002
##    360        1.0670             nan     0.0010    0.0002
##    380        1.0566             nan     0.0010    0.0002
##    400        1.0463             nan     0.0010    0.0002
##    420        1.0367             nan     0.0010    0.0002
##    440        1.0274             nan     0.0010    0.0002
##    460        1.0179             nan     0.0010    0.0002
##    480        1.0088             nan     0.0010    0.0002
##    500        1.0000             nan     0.0010    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0004
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3183             nan     0.0010    0.0004
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3164             nan     0.0010    0.0004
##      6        1.3155             nan     0.0010    0.0004
##      7        1.3146             nan     0.0010    0.0004
##      8        1.3136             nan     0.0010    0.0004
##      9        1.3127             nan     0.0010    0.0004
##     10        1.3118             nan     0.0010    0.0004
##     20        1.3029             nan     0.0010    0.0004
##     40        1.2852             nan     0.0010    0.0004
##     60        1.2685             nan     0.0010    0.0003
##     80        1.2521             nan     0.0010    0.0004
##    100        1.2363             nan     0.0010    0.0003
##    120        1.2208             nan     0.0010    0.0004
##    140        1.2060             nan     0.0010    0.0003
##    160        1.1918             nan     0.0010    0.0003
##    180        1.1779             nan     0.0010    0.0003
##    200        1.1646             nan     0.0010    0.0003
##    220        1.1515             nan     0.0010    0.0003
##    240        1.1390             nan     0.0010    0.0003
##    260        1.1270             nan     0.0010    0.0003
##    280        1.1152             nan     0.0010    0.0003
##    300        1.1038             nan     0.0010    0.0002
##    320        1.0925             nan     0.0010    0.0002
##    340        1.0816             nan     0.0010    0.0003
##    360        1.0711             nan     0.0010    0.0002
##    380        1.0607             nan     0.0010    0.0002
##    400        1.0508             nan     0.0010    0.0002
##    420        1.0409             nan     0.0010    0.0002
##    440        1.0312             nan     0.0010    0.0002
##    460        1.0218             nan     0.0010    0.0002
##    480        1.0126             nan     0.0010    0.0002
##    500        1.0038             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3124             nan     0.0100    0.0042
##      2        1.3045             nan     0.0100    0.0034
##      3        1.2965             nan     0.0100    0.0038
##      4        1.2885             nan     0.0100    0.0039
##      5        1.2806             nan     0.0100    0.0038
##      6        1.2725             nan     0.0100    0.0037
##      7        1.2649             nan     0.0100    0.0038
##      8        1.2574             nan     0.0100    0.0034
##      9        1.2498             nan     0.0100    0.0036
##     10        1.2426             nan     0.0100    0.0033
##     20        1.1760             nan     0.0100    0.0027
##     40        1.0706             nan     0.0100    0.0019
##     60        0.9896             nan     0.0100    0.0012
##     80        0.9270             nan     0.0100    0.0011
##    100        0.8769             nan     0.0100    0.0007
##    120        0.8356             nan     0.0100    0.0008
##    140        0.8029             nan     0.0100    0.0005
##    160        0.7756             nan     0.0100    0.0003
##    180        0.7504             nan     0.0100    0.0004
##    200        0.7295             nan     0.0100    0.0003
##    220        0.7101             nan     0.0100    0.0002
##    240        0.6923             nan     0.0100    0.0001
##    260        0.6769             nan     0.0100    0.0001
##    280        0.6620             nan     0.0100    0.0000
##    300        0.6484             nan     0.0100    0.0001
##    320        0.6356             nan     0.0100   -0.0001
##    340        0.6242             nan     0.0100    0.0001
##    360        0.6137             nan     0.0100   -0.0000
##    380        0.6022             nan     0.0100    0.0001
##    400        0.5920             nan     0.0100    0.0000
##    420        0.5818             nan     0.0100    0.0000
##    440        0.5727             nan     0.0100   -0.0000
##    460        0.5629             nan     0.0100   -0.0000
##    480        0.5541             nan     0.0100   -0.0001
##    500        0.5451             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3127             nan     0.0100    0.0041
##      2        1.3043             nan     0.0100    0.0037
##      3        1.2957             nan     0.0100    0.0041
##      4        1.2875             nan     0.0100    0.0040
##      5        1.2789             nan     0.0100    0.0036
##      6        1.2718             nan     0.0100    0.0036
##      7        1.2637             nan     0.0100    0.0035
##      8        1.2558             nan     0.0100    0.0037
##      9        1.2487             nan     0.0100    0.0032
##     10        1.2409             nan     0.0100    0.0032
##     20        1.1747             nan     0.0100    0.0028
##     40        1.0698             nan     0.0100    0.0017
##     60        0.9923             nan     0.0100    0.0012
##     80        0.9293             nan     0.0100    0.0011
##    100        0.8789             nan     0.0100    0.0008
##    120        0.8392             nan     0.0100    0.0006
##    140        0.8054             nan     0.0100    0.0005
##    160        0.7779             nan     0.0100    0.0005
##    180        0.7537             nan     0.0100    0.0002
##    200        0.7324             nan     0.0100    0.0002
##    220        0.7128             nan     0.0100    0.0003
##    240        0.6950             nan     0.0100    0.0002
##    260        0.6794             nan     0.0100    0.0001
##    280        0.6645             nan     0.0100   -0.0000
##    300        0.6512             nan     0.0100   -0.0001
##    320        0.6392             nan     0.0100   -0.0000
##    340        0.6281             nan     0.0100   -0.0001
##    360        0.6179             nan     0.0100   -0.0002
##    380        0.6080             nan     0.0100    0.0001
##    400        0.5985             nan     0.0100   -0.0000
##    420        0.5885             nan     0.0100    0.0000
##    440        0.5804             nan     0.0100   -0.0000
##    460        0.5710             nan     0.0100   -0.0000
##    480        0.5625             nan     0.0100   -0.0001
##    500        0.5538             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3127             nan     0.0100    0.0040
##      2        1.3053             nan     0.0100    0.0032
##      3        1.2967             nan     0.0100    0.0040
##      4        1.2885             nan     0.0100    0.0038
##      5        1.2810             nan     0.0100    0.0039
##      6        1.2746             nan     0.0100    0.0029
##      7        1.2663             nan     0.0100    0.0039
##      8        1.2591             nan     0.0100    0.0032
##      9        1.2519             nan     0.0100    0.0034
##     10        1.2451             nan     0.0100    0.0034
##     20        1.1774             nan     0.0100    0.0029
##     40        1.0736             nan     0.0100    0.0019
##     60        0.9958             nan     0.0100    0.0015
##     80        0.9343             nan     0.0100    0.0012
##    100        0.8835             nan     0.0100    0.0008
##    120        0.8420             nan     0.0100    0.0006
##    140        0.8087             nan     0.0100    0.0004
##    160        0.7804             nan     0.0100    0.0005
##    180        0.7556             nan     0.0100    0.0003
##    200        0.7350             nan     0.0100    0.0002
##    220        0.7164             nan     0.0100   -0.0001
##    240        0.6994             nan     0.0100    0.0001
##    260        0.6843             nan     0.0100    0.0001
##    280        0.6703             nan     0.0100   -0.0000
##    300        0.6577             nan     0.0100    0.0001
##    320        0.6454             nan     0.0100   -0.0000
##    340        0.6343             nan     0.0100   -0.0001
##    360        0.6238             nan     0.0100   -0.0000
##    380        0.6141             nan     0.0100   -0.0001
##    400        0.6047             nan     0.0100   -0.0001
##    420        0.5955             nan     0.0100   -0.0001
##    440        0.5874             nan     0.0100   -0.0001
##    460        0.5792             nan     0.0100   -0.0001
##    480        0.5704             nan     0.0100   -0.0000
##    500        0.5622             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3120             nan     0.0100    0.0043
##      2        1.3032             nan     0.0100    0.0042
##      3        1.2940             nan     0.0100    0.0042
##      4        1.2854             nan     0.0100    0.0037
##      5        1.2768             nan     0.0100    0.0039
##      6        1.2687             nan     0.0100    0.0036
##      7        1.2607             nan     0.0100    0.0036
##      8        1.2521             nan     0.0100    0.0036
##      9        1.2440             nan     0.0100    0.0038
##     10        1.2368             nan     0.0100    0.0032
##     20        1.1677             nan     0.0100    0.0029
##     40        1.0560             nan     0.0100    0.0021
##     60        0.9719             nan     0.0100    0.0015
##     80        0.9060             nan     0.0100    0.0013
##    100        0.8530             nan     0.0100    0.0008
##    120        0.8099             nan     0.0100    0.0005
##    140        0.7725             nan     0.0100    0.0006
##    160        0.7416             nan     0.0100    0.0005
##    180        0.7160             nan     0.0100    0.0004
##    200        0.6928             nan     0.0100    0.0001
##    220        0.6723             nan     0.0100    0.0002
##    240        0.6524             nan     0.0100    0.0002
##    260        0.6345             nan     0.0100    0.0002
##    280        0.6183             nan     0.0100    0.0002
##    300        0.6039             nan     0.0100    0.0000
##    320        0.5912             nan     0.0100    0.0001
##    340        0.5775             nan     0.0100    0.0001
##    360        0.5645             nan     0.0100    0.0001
##    380        0.5531             nan     0.0100   -0.0001
##    400        0.5416             nan     0.0100    0.0001
##    420        0.5316             nan     0.0100    0.0000
##    440        0.5210             nan     0.0100   -0.0001
##    460        0.5116             nan     0.0100    0.0000
##    480        0.5027             nan     0.0100   -0.0001
##    500        0.4934             nan     0.0100   -0.0000
## 
## [... verbose boosting logs for the remaining resamples at shrinkage = 0.010 omitted; they follow the same pattern as the block above ...]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2389             nan     0.1000    0.0358
##      2        1.1752             nan     0.1000    0.0304
##      3        1.1165             nan     0.1000    0.0247
##      4        1.0662             nan     0.1000    0.0218
##      5        1.0242             nan     0.1000    0.0166
##      6        0.9898             nan     0.1000    0.0150
##      7        0.9526             nan     0.1000    0.0168
##      8        0.9264             nan     0.1000    0.0088
##      9        0.9043             nan     0.1000    0.0061
##     10        0.8783             nan     0.1000    0.0084
##     20        0.7208             nan     0.1000    0.0031
##     40        0.6002             nan     0.1000   -0.0002
##     60        0.5127             nan     0.1000   -0.0022
##     80        0.4498             nan     0.1000   -0.0006
##    100        0.3956             nan     0.1000   -0.0011
##    120        0.3558             nan     0.1000   -0.0003
##    140        0.3185             nan     0.1000   -0.0009
##    160        0.2824             nan     0.1000   -0.0000
##    180        0.2536             nan     0.1000   -0.0004
##    200        0.2282             nan     0.1000   -0.0003
##    220        0.2066             nan     0.1000   -0.0004
##    240        0.1874             nan     0.1000   -0.0007
##    260        0.1691             nan     0.1000    0.0002
##    280        0.1532             nan     0.1000   -0.0001
##    300        0.1377             nan     0.1000   -0.0008
##    320        0.1275             nan     0.1000   -0.0006
##    340        0.1176             nan     0.1000   -0.0004
##    360        0.1070             nan     0.1000    0.0001
##    380        0.0968             nan     0.1000   -0.0001
##    400        0.0890             nan     0.1000   -0.0001
##    420        0.0817             nan     0.1000   -0.0002
##    440        0.0763             nan     0.1000   -0.0003
##    460        0.0702             nan     0.1000   -0.0000
##    480        0.0644             nan     0.1000   -0.0001
##    500        0.0591             nan     0.1000   -0.0003
## 
## [... verbose boosting logs for the remaining resamples at shrinkage = 0.100 omitted; they follow the same pattern as the block above ...]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3194             nan     0.0010    0.0004
##      3        1.3185             nan     0.0010    0.0004
##      4        1.3176             nan     0.0010    0.0005
##      5        1.3167             nan     0.0010    0.0004
##      6        1.3159             nan     0.0010    0.0004
##      7        1.3150             nan     0.0010    0.0004
##      8        1.3142             nan     0.0010    0.0004
##      9        1.3133             nan     0.0010    0.0004
##     10        1.3125             nan     0.0010    0.0004
##     20        1.3036             nan     0.0010    0.0004
##     40        1.2863             nan     0.0010    0.0004
##     60        1.2697             nan     0.0010    0.0004
##     80        1.2534             nan     0.0010    0.0003
##    100        1.2383             nan     0.0010    0.0004
##    120        1.2233             nan     0.0010    0.0003
##    140        1.2088             nan     0.0010    0.0003
##    160        1.1944             nan     0.0010    0.0003
##    180        1.1809             nan     0.0010    0.0003
##    200        1.1676             nan     0.0010    0.0003
##    220        1.1548             nan     0.0010    0.0003
##    240        1.1424             nan     0.0010    0.0003
##    260        1.1305             nan     0.0010    0.0002
##    280        1.1187             nan     0.0010    0.0003
##    300        1.1072             nan     0.0010    0.0003
##    320        1.0964             nan     0.0010    0.0002
##    340        1.0856             nan     0.0010    0.0002
##    360        1.0751             nan     0.0010    0.0002
##    380        1.0646             nan     0.0010    0.0002
##    400        1.0544             nan     0.0010    0.0002
##    420        1.0446             nan     0.0010    0.0002
##    440        1.0351             nan     0.0010    0.0002
##    460        1.0260             nan     0.0010    0.0002
##    480        1.0172             nan     0.0010    0.0002
##    500        1.0083             nan     0.0010    0.0002
## 
## [... output truncated: analogous Iter/TrainDeviance iteration logs repeat for every cross-validation resample and shrinkage setting (0.0010, 0.0100, 0.1000) ...]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3113             nan     0.0100    0.0046
##      2        1.3013             nan     0.0100    0.0047
##      3        1.2914             nan     0.0100    0.0044
##      4        1.2818             nan     0.0100    0.0043
##      5        1.2735             nan     0.0100    0.0041
##      6        1.2653             nan     0.0100    0.0037
##      7        1.2566             nan     0.0100    0.0038
##      8        1.2472             nan     0.0100    0.0043
##      9        1.2393             nan     0.0100    0.0032
##     10        1.2310             nan     0.0100    0.0034
##     20        1.1546             nan     0.0100    0.0032
##     40        1.0318             nan     0.0100    0.0026
##     60        0.9385             nan     0.0100    0.0017
##     80        0.8650             nan     0.0100    0.0012
##    100        0.8049             nan     0.0100    0.0008
##    120        0.7556             nan     0.0100    0.0009
##    140        0.7144             nan     0.0100    0.0004
##    160        0.6811             nan     0.0100    0.0005
##    180        0.6536             nan     0.0100    0.0002
##    200        0.6293             nan     0.0100    0.0005
##    220        0.6062             nan     0.0100    0.0001
##    240        0.5860             nan     0.0100    0.0002
##    260        0.5665             nan     0.0100    0.0002
##    280        0.5491             nan     0.0100    0.0002
##    300        0.5337             nan     0.0100   -0.0002
##    320        0.5199             nan     0.0100    0.0001
##    340        0.5059             nan     0.0100    0.0000
##    360        0.4937             nan     0.0100   -0.0001
##    380        0.4800             nan     0.0100    0.0001
##    400        0.4679             nan     0.0100    0.0000
##    420        0.4574             nan     0.0100   -0.0001
##    440        0.4464             nan     0.0100    0.0000
##    460        0.4361             nan     0.0100   -0.0001
##    480        0.4264             nan     0.0100   -0.0001
##    500        0.4175             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3108             nan     0.0100    0.0050
##      2        1.3010             nan     0.0100    0.0047
##      3        1.2907             nan     0.0100    0.0046
##      4        1.2815             nan     0.0100    0.0042
##      5        1.2725             nan     0.0100    0.0041
##      6        1.2632             nan     0.0100    0.0039
##      7        1.2540             nan     0.0100    0.0039
##      8        1.2451             nan     0.0100    0.0045
##      9        1.2367             nan     0.0100    0.0041
##     10        1.2286             nan     0.0100    0.0040
##     20        1.1528             nan     0.0100    0.0032
##     40        1.0306             nan     0.0100    0.0025
##     60        0.9357             nan     0.0100    0.0019
##     80        0.8622             nan     0.0100    0.0013
##    100        0.8019             nan     0.0100    0.0011
##    120        0.7550             nan     0.0100    0.0007
##    140        0.7167             nan     0.0100    0.0005
##    160        0.6832             nan     0.0100    0.0003
##    180        0.6557             nan     0.0100    0.0001
##    200        0.6309             nan     0.0100    0.0003
##    220        0.6083             nan     0.0100    0.0001
##    240        0.5886             nan     0.0100    0.0001
##    260        0.5708             nan     0.0100    0.0002
##    280        0.5545             nan     0.0100    0.0001
##    300        0.5401             nan     0.0100    0.0001
##    320        0.5260             nan     0.0100    0.0000
##    340        0.5118             nan     0.0100   -0.0001
##    360        0.4993             nan     0.0100   -0.0001
##    380        0.4880             nan     0.0100   -0.0000
##    400        0.4773             nan     0.0100    0.0000
##    420        0.4669             nan     0.0100    0.0001
##    440        0.4565             nan     0.0100   -0.0000
##    460        0.4462             nan     0.0100   -0.0001
##    480        0.4359             nan     0.0100   -0.0001
##    500        0.4264             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3115             nan     0.0100    0.0044
##      2        1.3020             nan     0.0100    0.0041
##      3        1.2927             nan     0.0100    0.0045
##      4        1.2834             nan     0.0100    0.0043
##      5        1.2748             nan     0.0100    0.0042
##      6        1.2652             nan     0.0100    0.0044
##      7        1.2564             nan     0.0100    0.0040
##      8        1.2463             nan     0.0100    0.0043
##      9        1.2377             nan     0.0100    0.0035
##     10        1.2293             nan     0.0100    0.0039
##     20        1.1521             nan     0.0100    0.0030
##     40        1.0325             nan     0.0100    0.0022
##     60        0.9403             nan     0.0100    0.0017
##     80        0.8680             nan     0.0100    0.0011
##    100        0.8095             nan     0.0100    0.0008
##    120        0.7616             nan     0.0100    0.0007
##    140        0.7225             nan     0.0100    0.0005
##    160        0.6905             nan     0.0100    0.0003
##    180        0.6619             nan     0.0100    0.0003
##    200        0.6384             nan     0.0100    0.0002
##    220        0.6172             nan     0.0100    0.0004
##    240        0.5964             nan     0.0100    0.0003
##    260        0.5802             nan     0.0100    0.0000
##    280        0.5644             nan     0.0100   -0.0001
##    300        0.5495             nan     0.0100   -0.0000
##    320        0.5350             nan     0.0100   -0.0001
##    340        0.5225             nan     0.0100   -0.0001
##    360        0.5103             nan     0.0100   -0.0002
##    380        0.4986             nan     0.0100   -0.0002
##    400        0.4871             nan     0.0100    0.0000
##    420        0.4776             nan     0.0100   -0.0000
##    440        0.4672             nan     0.0100   -0.0000
##    460        0.4580             nan     0.0100   -0.0001
##    480        0.4485             nan     0.0100   -0.0001
##    500        0.4387             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2409             nan     0.1000    0.0376
##      2        1.1674             nan     0.1000    0.0334
##      3        1.1042             nan     0.1000    0.0302
##      4        1.0469             nan     0.1000    0.0250
##      5        1.0013             nan     0.1000    0.0187
##      6        0.9596             nan     0.1000    0.0181
##      7        0.9234             nan     0.1000    0.0141
##      8        0.8953             nan     0.1000    0.0105
##      9        0.8705             nan     0.1000    0.0121
##     10        0.8460             nan     0.1000    0.0080
##     20        0.6871             nan     0.1000    0.0038
##     40        0.5550             nan     0.1000   -0.0007
##     60        0.4788             nan     0.1000    0.0002
##     80        0.4161             nan     0.1000   -0.0001
##    100        0.3672             nan     0.1000   -0.0007
##    120        0.3215             nan     0.1000   -0.0007
##    140        0.2874             nan     0.1000   -0.0010
##    160        0.2593             nan     0.1000   -0.0006
##    180        0.2331             nan     0.1000   -0.0003
##    200        0.2091             nan     0.1000   -0.0000
##    220        0.1900             nan     0.1000   -0.0005
##    240        0.1725             nan     0.1000   -0.0005
##    260        0.1583             nan     0.1000   -0.0003
##    280        0.1450             nan     0.1000   -0.0002
##    300        0.1327             nan     0.1000   -0.0004
##    320        0.1221             nan     0.1000   -0.0004
##    340        0.1125             nan     0.1000   -0.0006
##    360        0.1022             nan     0.1000    0.0000
##    380        0.0945             nan     0.1000   -0.0001
##    400        0.0878             nan     0.1000   -0.0002
##    420        0.0804             nan     0.1000   -0.0005
##    440        0.0743             nan     0.1000   -0.0001
##    460        0.0688             nan     0.1000   -0.0002
##    480        0.0636             nan     0.1000   -0.0000
##    500        0.0589             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2389             nan     0.1000    0.0363
##      2        1.1628             nan     0.1000    0.0322
##      3        1.0994             nan     0.1000    0.0310
##      4        1.0475             nan     0.1000    0.0201
##      5        1.0022             nan     0.1000    0.0189
##      6        0.9618             nan     0.1000    0.0183
##      7        0.9246             nan     0.1000    0.0160
##      8        0.8950             nan     0.1000    0.0133
##      9        0.8704             nan     0.1000    0.0110
##     10        0.8399             nan     0.1000    0.0131
##     20        0.6891             nan     0.1000    0.0020
##     40        0.5571             nan     0.1000   -0.0003
##     60        0.4825             nan     0.1000   -0.0010
##     80        0.4206             nan     0.1000   -0.0007
##    100        0.3699             nan     0.1000   -0.0004
##    120        0.3311             nan     0.1000    0.0001
##    140        0.2961             nan     0.1000   -0.0012
##    160        0.2670             nan     0.1000   -0.0008
##    180        0.2400             nan     0.1000   -0.0010
##    200        0.2150             nan     0.1000   -0.0007
##    220        0.1956             nan     0.1000   -0.0006
##    240        0.1806             nan     0.1000   -0.0010
##    260        0.1633             nan     0.1000   -0.0009
##    280        0.1507             nan     0.1000   -0.0006
##    300        0.1371             nan     0.1000   -0.0004
##    320        0.1256             nan     0.1000   -0.0005
##    340        0.1162             nan     0.1000   -0.0002
##    360        0.1067             nan     0.1000   -0.0002
##    380        0.0981             nan     0.1000   -0.0003
##    400        0.0899             nan     0.1000   -0.0001
##    420        0.0822             nan     0.1000   -0.0003
##    440        0.0763             nan     0.1000   -0.0003
##    460        0.0695             nan     0.1000   -0.0001
##    480        0.0641             nan     0.1000   -0.0003
##    500        0.0595             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2392             nan     0.1000    0.0335
##      2        1.1654             nan     0.1000    0.0316
##      3        1.0993             nan     0.1000    0.0298
##      4        1.0537             nan     0.1000    0.0194
##      5        1.0079             nan     0.1000    0.0194
##      6        0.9666             nan     0.1000    0.0160
##      7        0.9312             nan     0.1000    0.0144
##      8        0.8988             nan     0.1000    0.0142
##      9        0.8726             nan     0.1000    0.0105
##     10        0.8498             nan     0.1000    0.0062
##     20        0.7075             nan     0.1000    0.0018
##     40        0.5776             nan     0.1000    0.0005
##     60        0.4946             nan     0.1000   -0.0011
##     80        0.4385             nan     0.1000   -0.0003
##    100        0.3841             nan     0.1000    0.0000
##    120        0.3490             nan     0.1000   -0.0009
##    140        0.3140             nan     0.1000   -0.0011
##    160        0.2843             nan     0.1000   -0.0018
##    180        0.2563             nan     0.1000   -0.0006
##    200        0.2319             nan     0.1000   -0.0006
##    220        0.2137             nan     0.1000   -0.0010
##    240        0.1953             nan     0.1000   -0.0004
##    260        0.1803             nan     0.1000   -0.0008
##    280        0.1645             nan     0.1000   -0.0001
##    300        0.1513             nan     0.1000   -0.0006
##    320        0.1396             nan     0.1000   -0.0005
##    340        0.1287             nan     0.1000   -0.0009
##    360        0.1192             nan     0.1000   -0.0005
##    380        0.1101             nan     0.1000   -0.0006
##    400        0.1027             nan     0.1000   -0.0001
##    420        0.0944             nan     0.1000   -0.0004
##    440        0.0867             nan     0.1000   -0.0003
##    460        0.0808             nan     0.1000   -0.0003
##    480        0.0753             nan     0.1000   -0.0003
##    500        0.0703             nan     0.1000   -0.0004
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2298             nan     0.1000    0.0408
##      2        1.1522             nan     0.1000    0.0339
##      3        1.0882             nan     0.1000    0.0291
##      4        1.0342             nan     0.1000    0.0221
##      5        0.9854             nan     0.1000    0.0226
##      6        0.9468             nan     0.1000    0.0161
##      7        0.9061             nan     0.1000    0.0171
##      8        0.8728             nan     0.1000    0.0141
##      9        0.8452             nan     0.1000    0.0115
##     10        0.8197             nan     0.1000    0.0089
##     20        0.6470             nan     0.1000    0.0015
##     40        0.5027             nan     0.1000    0.0006
##     60        0.4175             nan     0.1000   -0.0006
##     80        0.3599             nan     0.1000   -0.0010
##    100        0.3105             nan     0.1000    0.0001
##    120        0.2732             nan     0.1000   -0.0002
##    140        0.2367             nan     0.1000   -0.0009
##    160        0.2086             nan     0.1000    0.0000
##    180        0.1832             nan     0.1000   -0.0010
##    200        0.1601             nan     0.1000   -0.0003
##    220        0.1425             nan     0.1000   -0.0004
##    240        0.1258             nan     0.1000   -0.0000
##    260        0.1142             nan     0.1000   -0.0003
##    280        0.1034             nan     0.1000   -0.0002
##    300        0.0934             nan     0.1000   -0.0006
##    320        0.0843             nan     0.1000   -0.0000
##    340        0.0766             nan     0.1000   -0.0003
##    360        0.0691             nan     0.1000   -0.0001
##    380        0.0612             nan     0.1000   -0.0001
##    400        0.0557             nan     0.1000   -0.0002
##    420        0.0501             nan     0.1000   -0.0001
##    440        0.0452             nan     0.1000   -0.0002
##    460        0.0412             nan     0.1000   -0.0002
##    480        0.0374             nan     0.1000   -0.0002
##    500        0.0341             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2294             nan     0.1000    0.0424
##      2        1.1634             nan     0.1000    0.0297
##      3        1.0947             nan     0.1000    0.0303
##      4        1.0381             nan     0.1000    0.0260
##      5        0.9853             nan     0.1000    0.0207
##      6        0.9425             nan     0.1000    0.0175
##      7        0.9032             nan     0.1000    0.0147
##      8        0.8719             nan     0.1000    0.0120
##      9        0.8412             nan     0.1000    0.0125
##     10        0.8199             nan     0.1000    0.0073
##     20        0.6567             nan     0.1000    0.0041
##     40        0.5128             nan     0.1000   -0.0006
##     60        0.4234             nan     0.1000    0.0004
##     80        0.3618             nan     0.1000   -0.0011
##    100        0.3160             nan     0.1000   -0.0009
##    120        0.2769             nan     0.1000    0.0002
##    140        0.2437             nan     0.1000   -0.0012
##    160        0.2150             nan     0.1000    0.0003
##    180        0.1885             nan     0.1000   -0.0008
##    200        0.1672             nan     0.1000   -0.0003
##    220        0.1512             nan     0.1000   -0.0005
##    240        0.1366             nan     0.1000   -0.0004
##    260        0.1239             nan     0.1000   -0.0007
##    280        0.1126             nan     0.1000   -0.0005
##    300        0.0990             nan     0.1000   -0.0006
##    320        0.0892             nan     0.1000   -0.0002
##    340        0.0806             nan     0.1000   -0.0005
##    360        0.0728             nan     0.1000   -0.0003
##    380        0.0661             nan     0.1000   -0.0003
##    400        0.0604             nan     0.1000   -0.0006
##    420        0.0552             nan     0.1000   -0.0002
##    440        0.0493             nan     0.1000   -0.0001
##    460        0.0441             nan     0.1000   -0.0002
##    480        0.0396             nan     0.1000   -0.0001
##    500        0.0358             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2255             nan     0.1000    0.0440
##      2        1.1555             nan     0.1000    0.0312
##      3        1.0908             nan     0.1000    0.0280
##      4        1.0356             nan     0.1000    0.0232
##      5        0.9901             nan     0.1000    0.0201
##      6        0.9504             nan     0.1000    0.0152
##      7        0.9130             nan     0.1000    0.0170
##      8        0.8796             nan     0.1000    0.0154
##      9        0.8502             nan     0.1000    0.0118
##     10        0.8266             nan     0.1000    0.0078
##     20        0.6706             nan     0.1000    0.0044
##     40        0.5387             nan     0.1000   -0.0010
##     60        0.4603             nan     0.1000   -0.0013
##     80        0.3815             nan     0.1000   -0.0001
##    100        0.3247             nan     0.1000   -0.0013
##    120        0.2812             nan     0.1000   -0.0001
##    140        0.2488             nan     0.1000   -0.0011
##    160        0.2181             nan     0.1000   -0.0001
##    180        0.1931             nan     0.1000   -0.0006
##    200        0.1731             nan     0.1000   -0.0010
##    220        0.1530             nan     0.1000   -0.0001
##    240        0.1374             nan     0.1000   -0.0008
##    260        0.1221             nan     0.1000   -0.0003
##    280        0.1092             nan     0.1000   -0.0002
##    300        0.0986             nan     0.1000   -0.0001
##    320        0.0887             nan     0.1000   -0.0000
##    340        0.0801             nan     0.1000   -0.0004
##    360        0.0724             nan     0.1000   -0.0003
##    380        0.0651             nan     0.1000   -0.0003
##    400        0.0595             nan     0.1000   -0.0001
##    420        0.0545             nan     0.1000   -0.0001
##    440        0.0496             nan     0.1000   -0.0003
##    460        0.0453             nan     0.1000   -0.0002
##    480        0.0407             nan     0.1000   -0.0001
##    500        0.0369             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2284             nan     0.1000    0.0417
##      2        1.1512             nan     0.1000    0.0340
##      3        1.0809             nan     0.1000    0.0296
##      4        1.0205             nan     0.1000    0.0230
##      5        0.9744             nan     0.1000    0.0204
##      6        0.9341             nan     0.1000    0.0152
##      7        0.8929             nan     0.1000    0.0196
##      8        0.8646             nan     0.1000    0.0101
##      9        0.8326             nan     0.1000    0.0120
##     10        0.8052             nan     0.1000    0.0103
##     20        0.6331             nan     0.1000    0.0027
##     40        0.4766             nan     0.1000   -0.0007
##     60        0.3830             nan     0.1000    0.0011
##     80        0.3156             nan     0.1000   -0.0008
##    100        0.2633             nan     0.1000   -0.0002
##    120        0.2226             nan     0.1000   -0.0000
##    140        0.1930             nan     0.1000   -0.0004
##    160        0.1662             nan     0.1000   -0.0009
##    180        0.1441             nan     0.1000   -0.0000
##    200        0.1272             nan     0.1000   -0.0004
##    220        0.1101             nan     0.1000   -0.0004
##    240        0.0956             nan     0.1000   -0.0003
##    260        0.0844             nan     0.1000   -0.0002
##    280        0.0741             nan     0.1000   -0.0001
##    300        0.0654             nan     0.1000   -0.0002
##    320        0.0569             nan     0.1000   -0.0001
##    340        0.0502             nan     0.1000   -0.0001
##    360        0.0446             nan     0.1000   -0.0001
##    380        0.0393             nan     0.1000   -0.0001
##    400        0.0351             nan     0.1000   -0.0001
##    420        0.0319             nan     0.1000   -0.0001
##    440        0.0288             nan     0.1000   -0.0001
##    460        0.0252             nan     0.1000   -0.0001
##    480        0.0226             nan     0.1000   -0.0000
##    500        0.0200             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2328             nan     0.1000    0.0378
##      2        1.1574             nan     0.1000    0.0339
##      3        1.0888             nan     0.1000    0.0318
##      4        1.0274             nan     0.1000    0.0282
##      5        0.9751             nan     0.1000    0.0238
##      6        0.9361             nan     0.1000    0.0150
##      7        0.8962             nan     0.1000    0.0158
##      8        0.8629             nan     0.1000    0.0136
##      9        0.8305             nan     0.1000    0.0114
##     10        0.8035             nan     0.1000    0.0081
##     20        0.6321             nan     0.1000    0.0015
##     40        0.4800             nan     0.1000   -0.0007
##     60        0.3893             nan     0.1000   -0.0005
##     80        0.3280             nan     0.1000   -0.0002
##    100        0.2727             nan     0.1000   -0.0007
##    120        0.2275             nan     0.1000   -0.0010
##    140        0.1918             nan     0.1000   -0.0005
##    160        0.1641             nan     0.1000   -0.0007
##    180        0.1404             nan     0.1000   -0.0007
##    200        0.1215             nan     0.1000   -0.0005
##    220        0.1061             nan     0.1000   -0.0008
##    240        0.0917             nan     0.1000   -0.0003
##    260        0.0801             nan     0.1000   -0.0003
##    280        0.0705             nan     0.1000   -0.0001
##    300        0.0616             nan     0.1000   -0.0003
##    320        0.0548             nan     0.1000   -0.0001
##    340        0.0476             nan     0.1000   -0.0001
##    360        0.0416             nan     0.1000   -0.0003
##    380        0.0369             nan     0.1000   -0.0002
##    400        0.0330             nan     0.1000   -0.0002
##    420        0.0291             nan     0.1000   -0.0000
##    440        0.0257             nan     0.1000   -0.0001
##    460        0.0226             nan     0.1000   -0.0000
##    480        0.0201             nan     0.1000   -0.0001
##    500        0.0178             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2331             nan     0.1000    0.0367
##      2        1.1535             nan     0.1000    0.0363
##      3        1.0821             nan     0.1000    0.0310
##      4        1.0267             nan     0.1000    0.0248
##      5        0.9776             nan     0.1000    0.0221
##      6        0.9358             nan     0.1000    0.0186
##      7        0.8964             nan     0.1000    0.0165
##      8        0.8613             nan     0.1000    0.0152
##      9        0.8321             nan     0.1000    0.0098
##     10        0.8041             nan     0.1000    0.0116
##     20        0.6353             nan     0.1000    0.0019
##     40        0.4888             nan     0.1000   -0.0007
##     60        0.3940             nan     0.1000   -0.0007
##     80        0.3318             nan     0.1000   -0.0006
##    100        0.2794             nan     0.1000   -0.0008
##    120        0.2377             nan     0.1000   -0.0010
##    140        0.2045             nan     0.1000   -0.0010
##    160        0.1763             nan     0.1000   -0.0012
##    180        0.1523             nan     0.1000   -0.0003
##    200        0.1315             nan     0.1000   -0.0008
##    220        0.1154             nan     0.1000   -0.0005
##    240        0.1024             nan     0.1000   -0.0002
##    260        0.0914             nan     0.1000   -0.0006
##    280        0.0814             nan     0.1000   -0.0003
##    300        0.0711             nan     0.1000   -0.0000
##    320        0.0632             nan     0.1000   -0.0002
##    340        0.0561             nan     0.1000   -0.0001
##    360        0.0503             nan     0.1000   -0.0002
##    380        0.0447             nan     0.1000   -0.0002
##    400        0.0397             nan     0.1000   -0.0001
##    420        0.0352             nan     0.1000   -0.0001
##    440        0.0314             nan     0.1000   -0.0001
##    460        0.0278             nan     0.1000   -0.0001
##    480        0.0249             nan     0.1000   -0.0002
##    500        0.0225             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3199             nan     0.0010    0.0003
##      2        1.3191             nan     0.0010    0.0004
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3164             nan     0.0010    0.0004
##      6        1.3155             nan     0.0010    0.0004
##      7        1.3147             nan     0.0010    0.0004
##      8        1.3139             nan     0.0010    0.0004
##      9        1.3130             nan     0.0010    0.0004
##     10        1.3122             nan     0.0010    0.0004
##     20        1.3039             nan     0.0010    0.0003
##     40        1.2876             nan     0.0010    0.0004
##     60        1.2717             nan     0.0010    0.0003
##     80        1.2566             nan     0.0010    0.0003
##    100        1.2423             nan     0.0010    0.0002
##    120        1.2281             nan     0.0010    0.0003
##    140        1.2143             nan     0.0010    0.0003
##    160        1.2013             nan     0.0010    0.0003
##    180        1.1885             nan     0.0010    0.0003
##    200        1.1762             nan     0.0010    0.0003
##    220        1.1636             nan     0.0010    0.0002
##    240        1.1518             nan     0.0010    0.0003
##    260        1.1402             nan     0.0010    0.0003
##    280        1.1289             nan     0.0010    0.0003
##    300        1.1180             nan     0.0010    0.0002
##    320        1.1072             nan     0.0010    0.0002
##    340        1.0970             nan     0.0010    0.0002
##    360        1.0870             nan     0.0010    0.0002
##    380        1.0773             nan     0.0010    0.0002
##    400        1.0677             nan     0.0010    0.0002
##    420        1.0587             nan     0.0010    0.0002
##    440        1.0495             nan     0.0010    0.0002
##    460        1.0409             nan     0.0010    0.0002
##    480        1.0324             nan     0.0010    0.0002
##    500        1.0240             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3180             nan     0.0010    0.0003
##      4        1.3170             nan     0.0010    0.0004
##      5        1.3162             nan     0.0010    0.0004
##      6        1.3154             nan     0.0010    0.0004
##      7        1.3146             nan     0.0010    0.0004
##      8        1.3137             nan     0.0010    0.0004
##      9        1.3128             nan     0.0010    0.0004
##     10        1.3120             nan     0.0010    0.0004
##     20        1.3036             nan     0.0010    0.0003
##     40        1.2875             nan     0.0010    0.0004
##     60        1.2721             nan     0.0010    0.0003
##     80        1.2567             nan     0.0010    0.0004
##    100        1.2421             nan     0.0010    0.0003
##    120        1.2282             nan     0.0010    0.0003
##    140        1.2145             nan     0.0010    0.0003
##    160        1.2010             nan     0.0010    0.0003
##    180        1.1880             nan     0.0010    0.0003
##    200        1.1756             nan     0.0010    0.0002
##    220        1.1636             nan     0.0010    0.0002
##    240        1.1517             nan     0.0010    0.0003
##    260        1.1405             nan     0.0010    0.0003
##    280        1.1294             nan     0.0010    0.0003
##    300        1.1186             nan     0.0010    0.0002
##    320        1.1083             nan     0.0010    0.0002
##    340        1.0982             nan     0.0010    0.0002
##    360        1.0881             nan     0.0010    0.0002
##    380        1.0782             nan     0.0010    0.0002
##    400        1.0688             nan     0.0010    0.0002
##    420        1.0598             nan     0.0010    0.0002
##    440        1.0507             nan     0.0010    0.0002
##    460        1.0418             nan     0.0010    0.0002
##    480        1.0334             nan     0.0010    0.0002
##    500        1.0250             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3199             nan     0.0010    0.0004
##      2        1.3190             nan     0.0010    0.0004
##      3        1.3183             nan     0.0010    0.0003
##      4        1.3174             nan     0.0010    0.0004
##      5        1.3166             nan     0.0010    0.0004
##      6        1.3158             nan     0.0010    0.0003
##      7        1.3149             nan     0.0010    0.0003
##      8        1.3141             nan     0.0010    0.0004
##      9        1.3133             nan     0.0010    0.0004
##     10        1.3124             nan     0.0010    0.0004
##     20        1.3043             nan     0.0010    0.0003
##     40        1.2883             nan     0.0010    0.0003
##     60        1.2731             nan     0.0010    0.0003
##     80        1.2579             nan     0.0010    0.0004
##    100        1.2434             nan     0.0010    0.0003
##    120        1.2295             nan     0.0010    0.0003
##    140        1.2160             nan     0.0010    0.0003
##    160        1.2029             nan     0.0010    0.0003
##    180        1.1900             nan     0.0010    0.0003
##    200        1.1776             nan     0.0010    0.0003
##    220        1.1654             nan     0.0010    0.0003
##    240        1.1537             nan     0.0010    0.0003
##    260        1.1424             nan     0.0010    0.0002
##    280        1.1314             nan     0.0010    0.0002
##    300        1.1206             nan     0.0010    0.0003
##    320        1.1099             nan     0.0010    0.0002
##    340        1.0995             nan     0.0010    0.0002
##    360        1.0896             nan     0.0010    0.0002
##    380        1.0800             nan     0.0010    0.0002
##    400        1.0708             nan     0.0010    0.0002
##    420        1.0615             nan     0.0010    0.0002
##    440        1.0525             nan     0.0010    0.0002
##    460        1.0438             nan     0.0010    0.0002
##    480        1.0351             nan     0.0010    0.0002
##    500        1.0268             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3179             nan     0.0010    0.0004
##      4        1.3170             nan     0.0010    0.0004
##      5        1.3161             nan     0.0010    0.0004
##      6        1.3152             nan     0.0010    0.0004
##      7        1.3143             nan     0.0010    0.0004
##      8        1.3133             nan     0.0010    0.0005
##      9        1.3124             nan     0.0010    0.0004
##     10        1.3115             nan     0.0010    0.0004
##     20        1.3025             nan     0.0010    0.0004
##     40        1.2852             nan     0.0010    0.0004
##     60        1.2687             nan     0.0010    0.0004
##     80        1.2524             nan     0.0010    0.0003
##    100        1.2368             nan     0.0010    0.0004
##    120        1.2220             nan     0.0010    0.0003
##    140        1.2077             nan     0.0010    0.0003
##    160        1.1936             nan     0.0010    0.0003
##    180        1.1799             nan     0.0010    0.0003
##    200        1.1668             nan     0.0010    0.0003
##    220        1.1541             nan     0.0010    0.0002
##    240        1.1418             nan     0.0010    0.0003
##    260        1.1298             nan     0.0010    0.0003
##    280        1.1180             nan     0.0010    0.0002
##    300        1.1063             nan     0.0010    0.0002
##    320        1.0952             nan     0.0010    0.0002
##    340        1.0846             nan     0.0010    0.0002
##    360        1.0741             nan     0.0010    0.0002
##    380        1.0641             nan     0.0010    0.0002
##    400        1.0543             nan     0.0010    0.0002
##    420        1.0443             nan     0.0010    0.0002
##    440        1.0347             nan     0.0010    0.0002
##    460        1.0254             nan     0.0010    0.0002
##    480        1.0165             nan     0.0010    0.0002
##    500        1.0077             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3189             nan     0.0010    0.0004
##      3        1.3180             nan     0.0010    0.0004
##      4        1.3171             nan     0.0010    0.0004
##      5        1.3162             nan     0.0010    0.0004
##      6        1.3154             nan     0.0010    0.0004
##      7        1.3146             nan     0.0010    0.0003
##      8        1.3136             nan     0.0010    0.0004
##      9        1.3127             nan     0.0010    0.0004
##     10        1.3118             nan     0.0010    0.0004
##     20        1.3030             nan     0.0010    0.0004
##     40        1.2857             nan     0.0010    0.0004
##     60        1.2692             nan     0.0010    0.0004
##     80        1.2532             nan     0.0010    0.0003
##    100        1.2378             nan     0.0010    0.0004
##    120        1.2229             nan     0.0010    0.0003
##    140        1.2084             nan     0.0010    0.0003
##    160        1.1944             nan     0.0010    0.0003
##    180        1.1809             nan     0.0010    0.0003
##    200        1.1677             nan     0.0010    0.0003
##    220        1.1548             nan     0.0010    0.0003
##    240        1.1425             nan     0.0010    0.0003
##    260        1.1305             nan     0.0010    0.0002
##    280        1.1186             nan     0.0010    0.0003
##    300        1.1072             nan     0.0010    0.0003
##    320        1.0962             nan     0.0010    0.0003
##    340        1.0853             nan     0.0010    0.0002
##    360        1.0748             nan     0.0010    0.0002
##    380        1.0644             nan     0.0010    0.0002
##    400        1.0547             nan     0.0010    0.0002
##    420        1.0450             nan     0.0010    0.0002
##    440        1.0356             nan     0.0010    0.0002
##    460        1.0263             nan     0.0010    0.0002
##    480        1.0174             nan     0.0010    0.0002
##    500        1.0088             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3190             nan     0.0010    0.0003
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3164             nan     0.0010    0.0004
##      6        1.3155             nan     0.0010    0.0004
##      7        1.3146             nan     0.0010    0.0004
##      8        1.3137             nan     0.0010    0.0004
##      9        1.3128             nan     0.0010    0.0004
##     10        1.3119             nan     0.0010    0.0004
##     20        1.3032             nan     0.0010    0.0004
##     40        1.2862             nan     0.0010    0.0003
##     60        1.2700             nan     0.0010    0.0003
##     80        1.2543             nan     0.0010    0.0004
##    100        1.2386             nan     0.0010    0.0003
##    120        1.2241             nan     0.0010    0.0003
##    140        1.2101             nan     0.0010    0.0003
##    160        1.1963             nan     0.0010    0.0003
##    180        1.1830             nan     0.0010    0.0003
##    200        1.1698             nan     0.0010    0.0003
##    220        1.1572             nan     0.0010    0.0003
##    240        1.1448             nan     0.0010    0.0002
##    260        1.1327             nan     0.0010    0.0003
##    280        1.1208             nan     0.0010    0.0003
##    300        1.1093             nan     0.0010    0.0003
##    320        1.0983             nan     0.0010    0.0002
##    340        1.0876             nan     0.0010    0.0002
##    360        1.0771             nan     0.0010    0.0002
##    380        1.0670             nan     0.0010    0.0002
##    400        1.0570             nan     0.0010    0.0002
##    420        1.0477             nan     0.0010    0.0002
##    440        1.0384             nan     0.0010    0.0002
##    460        1.0292             nan     0.0010    0.0002
##    480        1.0203             nan     0.0010    0.0002
##    500        1.0116             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3178             nan     0.0010    0.0004
##      4        1.3168             nan     0.0010    0.0005
##      5        1.3158             nan     0.0010    0.0005
##      6        1.3148             nan     0.0010    0.0004
##      7        1.3139             nan     0.0010    0.0004
##      8        1.3131             nan     0.0010    0.0004
##      9        1.3122             nan     0.0010    0.0004
##     10        1.3112             nan     0.0010    0.0004
##     20        1.3021             nan     0.0010    0.0004
##     40        1.2842             nan     0.0010    0.0004
##     60        1.2670             nan     0.0010    0.0004
##     80        1.2502             nan     0.0010    0.0003
##    100        1.2339             nan     0.0010    0.0003
##    120        1.2184             nan     0.0010    0.0003
##    140        1.2026             nan     0.0010    0.0003
##    160        1.1879             nan     0.0010    0.0003
##    180        1.1737             nan     0.0010    0.0003
##    200        1.1600             nan     0.0010    0.0003
##    220        1.1464             nan     0.0010    0.0003
##    240        1.1335             nan     0.0010    0.0003
##    260        1.1206             nan     0.0010    0.0002
##    280        1.1082             nan     0.0010    0.0002
##    300        1.0963             nan     0.0010    0.0002
##    320        1.0847             nan     0.0010    0.0003
##    340        1.0733             nan     0.0010    0.0003
##    360        1.0622             nan     0.0010    0.0002
##    380        1.0516             nan     0.0010    0.0002
##    400        1.0410             nan     0.0010    0.0002
##    420        1.0307             nan     0.0010    0.0003
##    440        1.0210             nan     0.0010    0.0002
##    460        1.0114             nan     0.0010    0.0002
##    480        1.0020             nan     0.0010    0.0002
##    500        0.9928             nan     0.0010    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3187             nan     0.0010    0.0004
##      3        1.3178             nan     0.0010    0.0004
##      4        1.3168             nan     0.0010    0.0005
##      5        1.3158             nan     0.0010    0.0004
##      6        1.3148             nan     0.0010    0.0004
##      7        1.3140             nan     0.0010    0.0004
##      8        1.3130             nan     0.0010    0.0004
##      9        1.3121             nan     0.0010    0.0004
##     10        1.3112             nan     0.0010    0.0004
##     20        1.3020             nan     0.0010    0.0004
##     40        1.2841             nan     0.0010    0.0004
##     60        1.2670             nan     0.0010    0.0004
##     80        1.2506             nan     0.0010    0.0004
##    100        1.2344             nan     0.0010    0.0003
##    120        1.2190             nan     0.0010    0.0003
##    140        1.2040             nan     0.0010    0.0003
##    160        1.1892             nan     0.0010    0.0003
##    180        1.1753             nan     0.0010    0.0003
##    200        1.1612             nan     0.0010    0.0003
##    220        1.1478             nan     0.0010    0.0003
##    240        1.1349             nan     0.0010    0.0003
##    260        1.1224             nan     0.0010    0.0003
##    280        1.1103             nan     0.0010    0.0003
##    300        1.0984             nan     0.0010    0.0003
##    320        1.0867             nan     0.0010    0.0002
##    340        1.0757             nan     0.0010    0.0002
##    360        1.0646             nan     0.0010    0.0002
##    380        1.0542             nan     0.0010    0.0002
##    400        1.0438             nan     0.0010    0.0002
##    420        1.0337             nan     0.0010    0.0002
##    440        1.0240             nan     0.0010    0.0002
##    460        1.0144             nan     0.0010    0.0002
##    480        1.0049             nan     0.0010    0.0002
##    500        0.9958             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3179             nan     0.0010    0.0004
##      4        1.3170             nan     0.0010    0.0004
##      5        1.3161             nan     0.0010    0.0004
##      6        1.3151             nan     0.0010    0.0004
##      7        1.3142             nan     0.0010    0.0004
##      8        1.3134             nan     0.0010    0.0004
##      9        1.3124             nan     0.0010    0.0004
##     10        1.3116             nan     0.0010    0.0004
##     20        1.3026             nan     0.0010    0.0004
##     40        1.2849             nan     0.0010    0.0004
##     60        1.2680             nan     0.0010    0.0004
##     80        1.2517             nan     0.0010    0.0004
##    100        1.2356             nan     0.0010    0.0004
##    120        1.2202             nan     0.0010    0.0003
##    140        1.2054             nan     0.0010    0.0003
##    160        1.1913             nan     0.0010    0.0003
##    180        1.1775             nan     0.0010    0.0003
##    200        1.1639             nan     0.0010    0.0003
##    220        1.1509             nan     0.0010    0.0003
##    240        1.1381             nan     0.0010    0.0003
##    260        1.1259             nan     0.0010    0.0002
##    280        1.1136             nan     0.0010    0.0003
##    300        1.1019             nan     0.0010    0.0002
##    320        1.0905             nan     0.0010    0.0003
##    340        1.0794             nan     0.0010    0.0002
##    360        1.0686             nan     0.0010    0.0002
##    380        1.0578             nan     0.0010    0.0002
##    400        1.0475             nan     0.0010    0.0002
##    420        1.0375             nan     0.0010    0.0002
##    440        1.0277             nan     0.0010    0.0002
##    460        1.0185             nan     0.0010    0.0002
##    480        1.0091             nan     0.0010    0.0002
##    500        1.0000             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3111             nan     0.0100    0.0040
##      2        1.3020             nan     0.0100    0.0040
##      3        1.2941             nan     0.0100    0.0036
##      4        1.2858             nan     0.0100    0.0037
##      5        1.2780             nan     0.0100    0.0034
##      6        1.2700             nan     0.0100    0.0036
##      7        1.2621             nan     0.0100    0.0034
##      8        1.2556             nan     0.0100    0.0031
##      9        1.2482             nan     0.0100    0.0032
##     10        1.2410             nan     0.0100    0.0034
##     20        1.1745             nan     0.0100    0.0026
##     40        1.0684             nan     0.0100    0.0023
##     60        0.9852             nan     0.0100    0.0017
##     80        0.9191             nan     0.0100    0.0011
##    100        0.8686             nan     0.0100    0.0007
##    120        0.8268             nan     0.0100    0.0007
##    140        0.7922             nan     0.0100    0.0005
##    160        0.7620             nan     0.0100    0.0003
##    180        0.7360             nan     0.0100    0.0004
##    200        0.7137             nan     0.0100    0.0003
##    220        0.6938             nan     0.0100    0.0001
##    240        0.6761             nan     0.0100    0.0001
##    260        0.6596             nan     0.0100   -0.0001
##    280        0.6450             nan     0.0100    0.0002
##    300        0.6327             nan     0.0100    0.0000
##    320        0.6204             nan     0.0100   -0.0001
##    340        0.6086             nan     0.0100   -0.0001
##    360        0.5980             nan     0.0100   -0.0000
##    380        0.5882             nan     0.0100   -0.0000
##    400        0.5781             nan     0.0100   -0.0001
##    420        0.5684             nan     0.0100   -0.0001
##    440        0.5586             nan     0.0100   -0.0000
##    460        0.5505             nan     0.0100   -0.0001
##    480        0.5414             nan     0.0100    0.0001
##    500        0.5335             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3118             nan     0.0100    0.0037
##      2        1.3038             nan     0.0100    0.0036
##      3        1.2955             nan     0.0100    0.0035
##      4        1.2872             nan     0.0100    0.0036
##      5        1.2795             nan     0.0100    0.0036
##      6        1.2722             nan     0.0100    0.0032
##      7        1.2650             nan     0.0100    0.0032
##      8        1.2578             nan     0.0100    0.0029
##      9        1.2499             nan     0.0100    0.0035
##     10        1.2421             nan     0.0100    0.0037
##     20        1.1743             nan     0.0100    0.0030
##     40        1.0657             nan     0.0100    0.0020
##     60        0.9846             nan     0.0100    0.0013
##     80        0.9222             nan     0.0100    0.0011
##    100        0.8685             nan     0.0100    0.0009
##    120        0.8264             nan     0.0100    0.0006
##    140        0.7909             nan     0.0100    0.0005
##    160        0.7603             nan     0.0100    0.0004
##    180        0.7351             nan     0.0100    0.0001
##    200        0.7135             nan     0.0100    0.0002
##    220        0.6926             nan     0.0100    0.0002
##    240        0.6762             nan     0.0100    0.0001
##    260        0.6613             nan     0.0100    0.0002
##    280        0.6470             nan     0.0100    0.0001
##    300        0.6340             nan     0.0100    0.0000
##    320        0.6217             nan     0.0100   -0.0001
##    340        0.6102             nan     0.0100   -0.0001
##    360        0.6001             nan     0.0100   -0.0000
##    380        0.5892             nan     0.0100   -0.0000
##    400        0.5798             nan     0.0100    0.0000
##    420        0.5707             nan     0.0100    0.0000
##    440        0.5615             nan     0.0100   -0.0001
##    460        0.5530             nan     0.0100    0.0000
##    480        0.5447             nan     0.0100    0.0001
##    500        0.5367             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3125             nan     0.0100    0.0036
##      2        1.3048             nan     0.0100    0.0034
##      3        1.2963             nan     0.0100    0.0036
##      4        1.2881             nan     0.0100    0.0040
##      5        1.2807             nan     0.0100    0.0033
##      6        1.2724             nan     0.0100    0.0036
##      7        1.2643             nan     0.0100    0.0037
##      8        1.2574             nan     0.0100    0.0029
##      9        1.2496             nan     0.0100    0.0033
##     10        1.2421             nan     0.0100    0.0033
##     20        1.1765             nan     0.0100    0.0027
##     40        1.0693             nan     0.0100    0.0018
##     60        0.9904             nan     0.0100    0.0015
##     80        0.9269             nan     0.0100    0.0012
##    100        0.8742             nan     0.0100    0.0008
##    120        0.8303             nan     0.0100    0.0006
##    140        0.7943             nan     0.0100    0.0006
##    160        0.7647             nan     0.0100    0.0003
##    180        0.7394             nan     0.0100    0.0004
##    200        0.7176             nan     0.0100    0.0002
##    220        0.6992             nan     0.0100    0.0001
##    240        0.6816             nan     0.0100    0.0000
##    260        0.6663             nan     0.0100    0.0001
##    280        0.6528             nan     0.0100    0.0001
##    300        0.6398             nan     0.0100    0.0001
##    320        0.6283             nan     0.0100   -0.0000
##    340        0.6168             nan     0.0100   -0.0000
##    360        0.6063             nan     0.0100   -0.0000
##    380        0.5969             nan     0.0100    0.0001
##    400        0.5875             nan     0.0100   -0.0000
##    420        0.5782             nan     0.0100    0.0000
##    440        0.5690             nan     0.0100   -0.0000
##    460        0.5611             nan     0.0100   -0.0000
##    480        0.5529             nan     0.0100    0.0000
##    500        0.5457             nan     0.0100   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3115             nan     0.0100    0.0042
##      2        1.3025             nan     0.0100    0.0041
##      3        1.2936             nan     0.0100    0.0040
##      4        1.2852             nan     0.0100    0.0036
##      5        1.2773             nan     0.0100    0.0034
##      6        1.2692             nan     0.0100    0.0035
##      7        1.2614             nan     0.0100    0.0039
##      8        1.2532             nan     0.0100    0.0033
##      9        1.2453             nan     0.0100    0.0036
##     10        1.2373             nan     0.0100    0.0037
##     20        1.1686             nan     0.0100    0.0028
##     40        1.0558             nan     0.0100    0.0019
##     60        0.9685             nan     0.0100    0.0018
##     80        0.8992             nan     0.0100    0.0012
##    100        0.8450             nan     0.0100    0.0007
##    120        0.8001             nan     0.0100    0.0005
##    140        0.7625             nan     0.0100    0.0004
##    160        0.7314             nan     0.0100    0.0002
##    180        0.7053             nan     0.0100    0.0003
##    200        0.6801             nan     0.0100    0.0004
##    220        0.6590             nan     0.0100    0.0001
##    240        0.6404             nan     0.0100   -0.0000
##    260        0.6228             nan     0.0100    0.0000
##    280        0.6070             nan     0.0100   -0.0000
##    300        0.5921             nan     0.0100   -0.0000
##    320        0.5774             nan     0.0100   -0.0000
##    340        0.5650             nan     0.0100   -0.0001
##    360        0.5531             nan     0.0100   -0.0000
##    380        0.5421             nan     0.0100   -0.0001
##    400        0.5312             nan     0.0100    0.0000
##    420        0.5197             nan     0.0100   -0.0000
##    440        0.5097             nan     0.0100   -0.0000
##    460        0.4997             nan     0.0100   -0.0001
##    480        0.4902             nan     0.0100   -0.0000
##    500        0.4814             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3111             nan     0.0100    0.0044
##      2        1.3022             nan     0.0100    0.0040
##      3        1.2933             nan     0.0100    0.0044
##      4        1.2853             nan     0.0100    0.0034
##      5        1.2769             nan     0.0100    0.0037
##      6        1.2689             nan     0.0100    0.0040
##      7        1.2601             nan     0.0100    0.0038
##      8        1.2520             nan     0.0100    0.0036
##      9        1.2437             nan     0.0100    0.0035
##     10        1.2363             nan     0.0100    0.0033
##     20        1.1655             nan     0.0100    0.0029
##     40        1.0539             nan     0.0100    0.0022
##     60        0.9662             nan     0.0100    0.0015
##     80        0.8992             nan     0.0100    0.0011
##    100        0.8445             nan     0.0100    0.0011
##    120        0.8007             nan     0.0100    0.0005
##    140        0.7638             nan     0.0100    0.0005
##    160        0.7326             nan     0.0100    0.0005
##    180        0.7055             nan     0.0100    0.0003
##    200        0.6821             nan     0.0100    0.0003
##    220        0.6613             nan     0.0100    0.0001
##    240        0.6428             nan     0.0100    0.0003
##    260        0.6259             nan     0.0100    0.0002
##    280        0.6106             nan     0.0100    0.0002
##    300        0.5975             nan     0.0100    0.0002
##    320        0.5834             nan     0.0100   -0.0000
##    340        0.5711             nan     0.0100    0.0001
##    360        0.5588             nan     0.0100   -0.0001
##    380        0.5463             nan     0.0100   -0.0001
##    400        0.5366             nan     0.0100   -0.0001
##    420        0.5267             nan     0.0100    0.0000
##    440        0.5162             nan     0.0100    0.0000
##    460        0.5061             nan     0.0100    0.0001
##    480        0.4960             nan     0.0100   -0.0001
##    500        0.4871             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3113             nan     0.0100    0.0041
##      2        1.3025             nan     0.0100    0.0041
##      3        1.2945             nan     0.0100    0.0036
##      4        1.2864             nan     0.0100    0.0035
##      5        1.2780             nan     0.0100    0.0037
##      6        1.2693             nan     0.0100    0.0036
##      7        1.2617             nan     0.0100    0.0032
##      8        1.2546             nan     0.0100    0.0031
##      9        1.2476             nan     0.0100    0.0030
##     10        1.2403             nan     0.0100    0.0032
##     20        1.1691             nan     0.0100    0.0030
##     40        1.0578             nan     0.0100    0.0019
##     60        0.9729             nan     0.0100    0.0015
##     80        0.9041             nan     0.0100    0.0014
##    100        0.8495             nan     0.0100    0.0009
##    120        0.8059             nan     0.0100    0.0007
##    140        0.7689             nan     0.0100    0.0005
##    160        0.7353             nan     0.0100    0.0002
##    180        0.7103             nan     0.0100    0.0003
##    200        0.6877             nan     0.0100    0.0001
##    220        0.6674             nan     0.0100    0.0002
##    240        0.6491             nan     0.0100    0.0001
##    260        0.6318             nan     0.0100    0.0001
##    280        0.6156             nan     0.0100    0.0002
##    300        0.6011             nan     0.0100    0.0000
##    320        0.5892             nan     0.0100    0.0001
##    340        0.5776             nan     0.0100   -0.0001
##    360        0.5669             nan     0.0100    0.0001
##    380        0.5548             nan     0.0100    0.0001
##    400        0.5447             nan     0.0100   -0.0001
##    420        0.5343             nan     0.0100    0.0000
##    440        0.5248             nan     0.0100    0.0001
##    460        0.5165             nan     0.0100   -0.0001
##    480        0.5074             nan     0.0100   -0.0001
##    500        0.4987             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3109             nan     0.0100    0.0044
##      2        1.3022             nan     0.0100    0.0040
##      3        1.2934             nan     0.0100    0.0041
##      4        1.2845             nan     0.0100    0.0040
##      5        1.2760             nan     0.0100    0.0038
##      6        1.2671             nan     0.0100    0.0041
##      7        1.2582             nan     0.0100    0.0038
##      8        1.2501             nan     0.0100    0.0035
##      9        1.2423             nan     0.0100    0.0038
##     10        1.2338             nan     0.0100    0.0037
##     20        1.1617             nan     0.0100    0.0024
##     40        1.0422             nan     0.0100    0.0021
##     60        0.9512             nan     0.0100    0.0017
##     80        0.8787             nan     0.0100    0.0010
##    100        0.8225             nan     0.0100    0.0009
##    120        0.7744             nan     0.0100    0.0007
##    140        0.7355             nan     0.0100    0.0003
##    160        0.7026             nan     0.0100    0.0004
##    180        0.6736             nan     0.0100    0.0003
##    200        0.6478             nan     0.0100    0.0003
##    220        0.6246             nan     0.0100    0.0002
##    240        0.6037             nan     0.0100    0.0003
##    260        0.5862             nan     0.0100    0.0001
##    280        0.5688             nan     0.0100   -0.0000
##    300        0.5518             nan     0.0100    0.0001
##    320        0.5374             nan     0.0100    0.0001
##    340        0.5236             nan     0.0100    0.0001
##    360        0.5107             nan     0.0100   -0.0000
##    380        0.4982             nan     0.0100   -0.0001
##    400        0.4867             nan     0.0100   -0.0000
##    420        0.4751             nan     0.0100   -0.0000
##    440        0.4647             nan     0.0100    0.0000
##    460        0.4540             nan     0.0100    0.0001
##    480        0.4445             nan     0.0100   -0.0001
##    500        0.4347             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3113             nan     0.0100    0.0042
##      2        1.3021             nan     0.0100    0.0041
##      3        1.2929             nan     0.0100    0.0040
##      4        1.2844             nan     0.0100    0.0040
##      5        1.2756             nan     0.0100    0.0037
##      6        1.2664             nan     0.0100    0.0041
##      7        1.2585             nan     0.0100    0.0036
##      8        1.2500             nan     0.0100    0.0038
##      9        1.2423             nan     0.0100    0.0034
##     10        1.2340             nan     0.0100    0.0036
##     20        1.1611             nan     0.0100    0.0027
##     40        1.0418             nan     0.0100    0.0023
##     60        0.9507             nan     0.0100    0.0018
##     80        0.8803             nan     0.0100    0.0011
##    100        0.8247             nan     0.0100    0.0008
##    120        0.7783             nan     0.0100    0.0006
##    140        0.7382             nan     0.0100    0.0007
##    160        0.7058             nan     0.0100    0.0002
##    180        0.6775             nan     0.0100    0.0002
##    200        0.6512             nan     0.0100    0.0003
##    220        0.6285             nan     0.0100    0.0002
##    240        0.6078             nan     0.0100    0.0002
##    260        0.5898             nan     0.0100    0.0002
##    280        0.5738             nan     0.0100   -0.0001
##    300        0.5583             nan     0.0100    0.0001
##    320        0.5444             nan     0.0100   -0.0001
##    340        0.5325             nan     0.0100   -0.0001
##    360        0.5196             nan     0.0100   -0.0001
##    380        0.5078             nan     0.0100   -0.0001
##    400        0.4959             nan     0.0100    0.0000
##    420        0.4852             nan     0.0100    0.0000
##    440        0.4751             nan     0.0100    0.0000
##    460        0.4652             nan     0.0100   -0.0001
##    480        0.4555             nan     0.0100    0.0000
##    500        0.4456             nan     0.0100    0.0000
## 
## [verbose boosting traces for the remaining resamples at shrinkage = 0.010 omitted; each follows the same pattern as the trace above]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2297             nan     0.1000    0.0421
##      2        1.1638             nan     0.1000    0.0294
##      3        1.1084             nan     0.1000    0.0219
##      4        1.0613             nan     0.1000    0.0222
##      5        1.0177             nan     0.1000    0.0181
##      6        0.9789             nan     0.1000    0.0173
##      7        0.9433             nan     0.1000    0.0127
##      8        0.9130             nan     0.1000    0.0111
##      9        0.8858             nan     0.1000    0.0098
##     10        0.8612             nan     0.1000    0.0086
##     20        0.7126             nan     0.1000    0.0013
##     40        0.5758             nan     0.1000   -0.0015
##     60        0.4973             nan     0.1000    0.0000
##     80        0.4313             nan     0.1000    0.0001
##    100        0.3831             nan     0.1000   -0.0009
##    120        0.3423             nan     0.1000    0.0001
##    140        0.3043             nan     0.1000   -0.0004
##    160        0.2726             nan     0.1000   -0.0005
##    180        0.2498             nan     0.1000   -0.0004
##    200        0.2264             nan     0.1000   -0.0002
##    220        0.2031             nan     0.1000   -0.0003
##    240        0.1838             nan     0.1000   -0.0006
##    260        0.1680             nan     0.1000   -0.0001
##    280        0.1526             nan     0.1000   -0.0002
##    300        0.1396             nan     0.1000   -0.0004
##    320        0.1288             nan     0.1000   -0.0004
##    340        0.1189             nan     0.1000   -0.0002
##    360        0.1080             nan     0.1000   -0.0004
##    380        0.0983             nan     0.1000   -0.0006
##    400        0.0904             nan     0.1000   -0.0003
##    420        0.0830             nan     0.1000   -0.0005
##    440        0.0767             nan     0.1000   -0.0001
##    460        0.0703             nan     0.1000   -0.0003
##    480        0.0656             nan     0.1000   -0.0002
##    500        0.0614             nan     0.1000   -0.0002
## 
## [verbose boosting traces for the remaining resamples at shrinkage = 0.100 omitted; each follows the same pattern as the trace above]
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3178             nan     0.0010    0.0004
##      4        1.3169             nan     0.0010    0.0004
##      5        1.3160             nan     0.0010    0.0004
##      6        1.3152             nan     0.0010    0.0004
##      7        1.3143             nan     0.0010    0.0004
##      8        1.3134             nan     0.0010    0.0004
##      9        1.3125             nan     0.0010    0.0004
##     10        1.3117             nan     0.0010    0.0004
##     20        1.3031             nan     0.0010    0.0004
##     40        1.2864             nan     0.0010    0.0004
##     60        1.2699             nan     0.0010    0.0004
##     80        1.2543             nan     0.0010    0.0004
##    100        1.2391             nan     0.0010    0.0003
##    120        1.2244             nan     0.0010    0.0003
##    140        1.2101             nan     0.0010    0.0003
##    160        1.1960             nan     0.0010    0.0003
##    180        1.1827             nan     0.0010    0.0003
##    200        1.1697             nan     0.0010    0.0003
##    220        1.1568             nan     0.0010    0.0003
##    240        1.1447             nan     0.0010    0.0003
##    260        1.1330             nan     0.0010    0.0003
##    280        1.1213             nan     0.0010    0.0003
##    300        1.1104             nan     0.0010    0.0002
##    320        1.0997             nan     0.0010    0.0002
##    340        1.0892             nan     0.0010    0.0003
##    360        1.0785             nan     0.0010    0.0002
##    380        1.0686             nan     0.0010    0.0002
##    400        1.0590             nan     0.0010    0.0002
##    420        1.0496             nan     0.0010    0.0002
##    440        1.0405             nan     0.0010    0.0002
##    460        1.0314             nan     0.0010    0.0002
##    480        1.0227             nan     0.0010    0.0002
##    500        1.0143             nan     0.0010    0.0002
## 
## [verbose boosting traces for the remaining resamples at shrinkage = 0.001 omitted; each follows the same pattern as the trace above]
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3179             nan     0.0010    0.0004
##      4        1.3170             nan     0.0010    0.0004
##      5        1.3161             nan     0.0010    0.0004
##      6        1.3152             nan     0.0010    0.0004
##      7        1.3143             nan     0.0010    0.0004
##      8        1.3134             nan     0.0010    0.0004
##      9        1.3125             nan     0.0010    0.0004
##     10        1.3116             nan     0.0010    0.0004
##     20        1.3029             nan     0.0010    0.0004
##     40        1.2859             nan     0.0010    0.0004
##     60        1.2693             nan     0.0010    0.0004
##     80        1.2529             nan     0.0010    0.0004
##    100        1.2372             nan     0.0010    0.0003
##    120        1.2220             nan     0.0010    0.0004
##    140        1.2072             nan     0.0010    0.0003
##    160        1.1933             nan     0.0010    0.0003
##    180        1.1796             nan     0.0010    0.0003
##    200        1.1660             nan     0.0010    0.0003
##    220        1.1532             nan     0.0010    0.0003
##    240        1.1403             nan     0.0010    0.0003
##    260        1.1281             nan     0.0010    0.0003
##    280        1.1162             nan     0.0010    0.0003
##    300        1.1048             nan     0.0010    0.0002
##    320        1.0936             nan     0.0010    0.0002
##    340        1.0826             nan     0.0010    0.0002
##    360        1.0720             nan     0.0010    0.0002
##    380        1.0612             nan     0.0010    0.0002
##    400        1.0512             nan     0.0010    0.0002
##    420        1.0413             nan     0.0010    0.0002
##    440        1.0318             nan     0.0010    0.0002
##    460        1.0225             nan     0.0010    0.0002
##    480        1.0135             nan     0.0010    0.0002
##    500        1.0048             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3187             nan     0.0010    0.0004
##      3        1.3178             nan     0.0010    0.0004
##      4        1.3168             nan     0.0010    0.0004
##      5        1.3158             nan     0.0010    0.0005
##      6        1.3148             nan     0.0010    0.0004
##      7        1.3139             nan     0.0010    0.0004
##      8        1.3129             nan     0.0010    0.0004
##      9        1.3120             nan     0.0010    0.0004
##     10        1.3110             nan     0.0010    0.0004
##     20        1.3012             nan     0.0010    0.0004
##     40        1.2823             nan     0.0010    0.0004
##     60        1.2640             nan     0.0010    0.0004
##     80        1.2466             nan     0.0010    0.0003
##    100        1.2299             nan     0.0010    0.0004
##    120        1.2135             nan     0.0010    0.0004
##    140        1.1975             nan     0.0010    0.0003
##    160        1.1825             nan     0.0010    0.0004
##    180        1.1680             nan     0.0010    0.0003
##    200        1.1538             nan     0.0010    0.0003
##    220        1.1401             nan     0.0010    0.0003
##    240        1.1266             nan     0.0010    0.0003
##    260        1.1138             nan     0.0010    0.0003
##    280        1.1013             nan     0.0010    0.0002
##    300        1.0890             nan     0.0010    0.0003
##    320        1.0774             nan     0.0010    0.0002
##    340        1.0658             nan     0.0010    0.0002
##    360        1.0545             nan     0.0010    0.0002
##    380        1.0435             nan     0.0010    0.0002
##    400        1.0330             nan     0.0010    0.0002
##    420        1.0227             nan     0.0010    0.0002
##    440        1.0126             nan     0.0010    0.0002
##    460        1.0028             nan     0.0010    0.0002
##    480        0.9932             nan     0.0010    0.0002
##    500        0.9840             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0005
##      2        1.3187             nan     0.0010    0.0005
##      3        1.3177             nan     0.0010    0.0005
##      4        1.3168             nan     0.0010    0.0004
##      5        1.3158             nan     0.0010    0.0004
##      6        1.3148             nan     0.0010    0.0004
##      7        1.3139             nan     0.0010    0.0004
##      8        1.3129             nan     0.0010    0.0004
##      9        1.3120             nan     0.0010    0.0004
##     10        1.3109             nan     0.0010    0.0005
##     20        1.3016             nan     0.0010    0.0004
##     40        1.2833             nan     0.0010    0.0004
##     60        1.2655             nan     0.0010    0.0004
##     80        1.2482             nan     0.0010    0.0004
##    100        1.2317             nan     0.0010    0.0004
##    120        1.2155             nan     0.0010    0.0004
##    140        1.2002             nan     0.0010    0.0003
##    160        1.1851             nan     0.0010    0.0003
##    180        1.1707             nan     0.0010    0.0003
##    200        1.1566             nan     0.0010    0.0003
##    220        1.1428             nan     0.0010    0.0003
##    240        1.1294             nan     0.0010    0.0003
##    260        1.1165             nan     0.0010    0.0003
##    280        1.1042             nan     0.0010    0.0002
##    300        1.0918             nan     0.0010    0.0003
##    320        1.0801             nan     0.0010    0.0002
##    340        1.0687             nan     0.0010    0.0003
##    360        1.0573             nan     0.0010    0.0002
##    380        1.0466             nan     0.0010    0.0002
##    400        1.0362             nan     0.0010    0.0002
##    420        1.0260             nan     0.0010    0.0002
##    440        1.0160             nan     0.0010    0.0002
##    460        1.0063             nan     0.0010    0.0002
##    480        0.9968             nan     0.0010    0.0002
##    500        0.9877             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0005
##      2        1.3187             nan     0.0010    0.0004
##      3        1.3177             nan     0.0010    0.0005
##      4        1.3167             nan     0.0010    0.0004
##      5        1.3158             nan     0.0010    0.0004
##      6        1.3149             nan     0.0010    0.0005
##      7        1.3140             nan     0.0010    0.0004
##      8        1.3131             nan     0.0010    0.0004
##      9        1.3121             nan     0.0010    0.0004
##     10        1.3113             nan     0.0010    0.0004
##     20        1.3020             nan     0.0010    0.0004
##     40        1.2839             nan     0.0010    0.0004
##     60        1.2666             nan     0.0010    0.0004
##     80        1.2498             nan     0.0010    0.0003
##    100        1.2335             nan     0.0010    0.0004
##    120        1.2177             nan     0.0010    0.0003
##    140        1.2026             nan     0.0010    0.0004
##    160        1.1880             nan     0.0010    0.0003
##    180        1.1735             nan     0.0010    0.0003
##    200        1.1596             nan     0.0010    0.0003
##    220        1.1460             nan     0.0010    0.0003
##    240        1.1330             nan     0.0010    0.0003
##    260        1.1201             nan     0.0010    0.0003
##    280        1.1077             nan     0.0010    0.0003
##    300        1.0961             nan     0.0010    0.0002
##    320        1.0847             nan     0.0010    0.0002
##    340        1.0734             nan     0.0010    0.0002
##    360        1.0626             nan     0.0010    0.0002
##    380        1.0516             nan     0.0010    0.0003
##    400        1.0413             nan     0.0010    0.0002
##    420        1.0314             nan     0.0010    0.0002
##    440        1.0215             nan     0.0010    0.0002
##    460        1.0120             nan     0.0010    0.0002
##    480        1.0026             nan     0.0010    0.0002
##    500        0.9937             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3127             nan     0.0100    0.0034
##      2        1.3038             nan     0.0100    0.0041
##      3        1.2946             nan     0.0100    0.0043
##      4        1.2858             nan     0.0100    0.0038
##      5        1.2778             nan     0.0100    0.0037
##      6        1.2695             nan     0.0100    0.0041
##      7        1.2609             nan     0.0100    0.0039
##      8        1.2536             nan     0.0100    0.0032
##      9        1.2456             nan     0.0100    0.0038
##     10        1.2383             nan     0.0100    0.0033
##     20        1.1696             nan     0.0100    0.0023
##     40        1.0596             nan     0.0100    0.0020
##     60        0.9749             nan     0.0100    0.0015
##     80        0.9086             nan     0.0100    0.0011
##    100        0.8551             nan     0.0100    0.0011
##    120        0.8111             nan     0.0100    0.0005
##    140        0.7764             nan     0.0100    0.0005
##    160        0.7466             nan     0.0100    0.0003
##    180        0.7220             nan     0.0100    0.0002
##    200        0.7003             nan     0.0100    0.0002
##    220        0.6801             nan     0.0100    0.0001
##    240        0.6637             nan     0.0100    0.0001
##    260        0.6480             nan     0.0100   -0.0000
##    280        0.6340             nan     0.0100    0.0001
##    300        0.6208             nan     0.0100    0.0001
##    320        0.6095             nan     0.0100    0.0001
##    340        0.5993             nan     0.0100   -0.0000
##    360        0.5889             nan     0.0100   -0.0001
##    380        0.5786             nan     0.0100   -0.0002
##    400        0.5689             nan     0.0100    0.0000
##    420        0.5600             nan     0.0100   -0.0000
##    440        0.5506             nan     0.0100    0.0001
##    460        0.5416             nan     0.0100   -0.0000
##    480        0.5336             nan     0.0100   -0.0000
##    500        0.5262             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3115             nan     0.0100    0.0042
##      2        1.3026             nan     0.0100    0.0040
##      3        1.2943             nan     0.0100    0.0038
##      4        1.2866             nan     0.0100    0.0035
##      5        1.2775             nan     0.0100    0.0041
##      6        1.2691             nan     0.0100    0.0037
##      7        1.2618             nan     0.0100    0.0033
##      8        1.2542             nan     0.0100    0.0035
##      9        1.2463             nan     0.0100    0.0035
##     10        1.2388             nan     0.0100    0.0036
##     20        1.1700             nan     0.0100    0.0032
##     40        1.0600             nan     0.0100    0.0018
##     60        0.9775             nan     0.0100    0.0017
##     80        0.9119             nan     0.0100    0.0009
##    100        0.8589             nan     0.0100    0.0008
##    120        0.8172             nan     0.0100    0.0007
##    140        0.7813             nan     0.0100    0.0005
##    160        0.7530             nan     0.0100    0.0003
##    180        0.7288             nan     0.0100    0.0003
##    200        0.7081             nan     0.0100    0.0003
##    220        0.6880             nan     0.0100    0.0001
##    240        0.6711             nan     0.0100    0.0001
##    260        0.6553             nan     0.0100    0.0001
##    280        0.6422             nan     0.0100    0.0001
##    300        0.6297             nan     0.0100    0.0001
##    320        0.6174             nan     0.0100    0.0001
##    340        0.6061             nan     0.0100    0.0000
##    360        0.5956             nan     0.0100    0.0000
##    380        0.5856             nan     0.0100    0.0000
##    400        0.5755             nan     0.0100   -0.0001
##    420        0.5666             nan     0.0100    0.0001
##    440        0.5582             nan     0.0100   -0.0001
##    460        0.5495             nan     0.0100    0.0000
##    480        0.5411             nan     0.0100   -0.0000
##    500        0.5340             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3122             nan     0.0100    0.0037
##      2        1.3035             nan     0.0100    0.0038
##      3        1.2946             nan     0.0100    0.0045
##      4        1.2868             nan     0.0100    0.0034
##      5        1.2789             nan     0.0100    0.0034
##      6        1.2714             nan     0.0100    0.0036
##      7        1.2642             nan     0.0100    0.0034
##      8        1.2567             nan     0.0100    0.0029
##      9        1.2495             nan     0.0100    0.0034
##     10        1.2419             nan     0.0100    0.0031
##     20        1.1708             nan     0.0100    0.0030
##     40        1.0609             nan     0.0100    0.0020
##     60        0.9789             nan     0.0100    0.0015
##     80        0.9144             nan     0.0100    0.0012
##    100        0.8630             nan     0.0100    0.0007
##    120        0.8212             nan     0.0100    0.0007
##    140        0.7876             nan     0.0100    0.0004
##    160        0.7594             nan     0.0100    0.0004
##    180        0.7353             nan     0.0100    0.0003
##    200        0.7133             nan     0.0100    0.0003
##    220        0.6948             nan     0.0100    0.0001
##    240        0.6778             nan     0.0100    0.0001
##    260        0.6620             nan     0.0100    0.0001
##    280        0.6482             nan     0.0100    0.0001
##    300        0.6360             nan     0.0100    0.0000
##    320        0.6251             nan     0.0100    0.0000
##    340        0.6138             nan     0.0100    0.0000
##    360        0.6042             nan     0.0100   -0.0001
##    380        0.5934             nan     0.0100   -0.0000
##    400        0.5841             nan     0.0100    0.0000
##    420        0.5752             nan     0.0100    0.0001
##    440        0.5672             nan     0.0100   -0.0001
##    460        0.5592             nan     0.0100    0.0000
##    480        0.5515             nan     0.0100   -0.0001
##    500        0.5441             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3112             nan     0.0100    0.0043
##      2        1.3019             nan     0.0100    0.0043
##      3        1.2930             nan     0.0100    0.0040
##      4        1.2844             nan     0.0100    0.0038
##      5        1.2755             nan     0.0100    0.0042
##      6        1.2667             nan     0.0100    0.0040
##      7        1.2581             nan     0.0100    0.0040
##      8        1.2497             nan     0.0100    0.0041
##      9        1.2412             nan     0.0100    0.0035
##     10        1.2337             nan     0.0100    0.0030
##     20        1.1590             nan     0.0100    0.0032
##     40        1.0439             nan     0.0100    0.0022
##     60        0.9549             nan     0.0100    0.0014
##     80        0.8858             nan     0.0100    0.0012
##    100        0.8317             nan     0.0100    0.0009
##    120        0.7865             nan     0.0100    0.0009
##    140        0.7492             nan     0.0100    0.0004
##    160        0.7189             nan     0.0100    0.0005
##    180        0.6910             nan     0.0100    0.0001
##    200        0.6670             nan     0.0100    0.0003
##    220        0.6448             nan     0.0100    0.0000
##    240        0.6262             nan     0.0100    0.0003
##    260        0.6092             nan     0.0100    0.0002
##    280        0.5947             nan     0.0100   -0.0003
##    300        0.5819             nan     0.0100    0.0000
##    320        0.5688             nan     0.0100   -0.0000
##    340        0.5566             nan     0.0100    0.0000
##    360        0.5461             nan     0.0100   -0.0001
##    380        0.5353             nan     0.0100   -0.0001
##    400        0.5245             nan     0.0100   -0.0000
##    420        0.5149             nan     0.0100   -0.0001
##    440        0.5048             nan     0.0100   -0.0000
##    460        0.4960             nan     0.0100   -0.0001
##    480        0.4878             nan     0.0100   -0.0002
##    500        0.4793             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3116             nan     0.0100    0.0038
##      2        1.3021             nan     0.0100    0.0044
##      3        1.2934             nan     0.0100    0.0042
##      4        1.2839             nan     0.0100    0.0041
##      5        1.2746             nan     0.0100    0.0038
##      6        1.2660             nan     0.0100    0.0036
##      7        1.2570             nan     0.0100    0.0042
##      8        1.2488             nan     0.0100    0.0037
##      9        1.2407             nan     0.0100    0.0039
##     10        1.2320             nan     0.0100    0.0039
##     20        1.1566             nan     0.0100    0.0030
##     40        1.0435             nan     0.0100    0.0018
##     60        0.9564             nan     0.0100    0.0017
##     80        0.8892             nan     0.0100    0.0015
##    100        0.8339             nan     0.0100    0.0010
##    120        0.7923             nan     0.0100    0.0008
##    140        0.7572             nan     0.0100    0.0004
##    160        0.7268             nan     0.0100    0.0003
##    180        0.7002             nan     0.0100    0.0004
##    200        0.6775             nan     0.0100    0.0001
##    220        0.6571             nan     0.0100    0.0001
##    240        0.6394             nan     0.0100    0.0001
##    260        0.6235             nan     0.0100    0.0000
##    280        0.6088             nan     0.0100    0.0002
##    300        0.5958             nan     0.0100    0.0002
##    320        0.5827             nan     0.0100   -0.0002
##    340        0.5708             nan     0.0100   -0.0000
##    360        0.5595             nan     0.0100   -0.0000
##    380        0.5491             nan     0.0100    0.0000
##    400        0.5383             nan     0.0100   -0.0001
##    420        0.5286             nan     0.0100   -0.0001
##    440        0.5188             nan     0.0100   -0.0001
##    460        0.5089             nan     0.0100    0.0000
##    480        0.5004             nan     0.0100    0.0001
##    500        0.4919             nan     0.0100   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3114             nan     0.0100    0.0041
##      2        1.3027             nan     0.0100    0.0041
##      3        1.2940             nan     0.0100    0.0039
##      4        1.2855             nan     0.0100    0.0041
##      5        1.2774             nan     0.0100    0.0034
##      6        1.2689             nan     0.0100    0.0039
##      7        1.2607             nan     0.0100    0.0037
##      8        1.2528             nan     0.0100    0.0038
##      9        1.2448             nan     0.0100    0.0035
##     10        1.2371             nan     0.0100    0.0036
##     20        1.1661             nan     0.0100    0.0028
##     40        1.0493             nan     0.0100    0.0022
##     60        0.9617             nan     0.0100    0.0018
##     80        0.8919             nan     0.0100    0.0011
##    100        0.8382             nan     0.0100    0.0010
##    120        0.7930             nan     0.0100    0.0006
##    140        0.7580             nan     0.0100    0.0003
##    160        0.7287             nan     0.0100    0.0004
##    180        0.7041             nan     0.0100    0.0004
##    200        0.6839             nan     0.0100    0.0002
##    220        0.6638             nan     0.0100    0.0001
##    240        0.6463             nan     0.0100   -0.0001
##    260        0.6305             nan     0.0100    0.0002
##    280        0.6168             nan     0.0100    0.0001
##    300        0.6028             nan     0.0100    0.0002
##    320        0.5908             nan     0.0100   -0.0000
##    340        0.5796             nan     0.0100   -0.0001
##    360        0.5681             nan     0.0100    0.0000
##    380        0.5571             nan     0.0100   -0.0001
##    400        0.5475             nan     0.0100   -0.0000
##    420        0.5377             nan     0.0100    0.0000
##    440        0.5282             nan     0.0100   -0.0000
##    460        0.5181             nan     0.0100   -0.0000
##    480        0.5091             nan     0.0100    0.0001
##    500        0.5002             nan     0.0100   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3102             nan     0.0100    0.0041
##      2        1.3005             nan     0.0100    0.0042
##      3        1.2915             nan     0.0100    0.0040
##      4        1.2823             nan     0.0100    0.0042
##      5        1.2731             nan     0.0100    0.0041
##      6        1.2640             nan     0.0100    0.0041
##      7        1.2544             nan     0.0100    0.0041
##      8        1.2452             nan     0.0100    0.0042
##      9        1.2365             nan     0.0100    0.0040
##     10        1.2281             nan     0.0100    0.0038
##     20        1.1507             nan     0.0100    0.0030
##     40        1.0283             nan     0.0100    0.0020
##     60        0.9383             nan     0.0100    0.0016
##     80        0.8666             nan     0.0100    0.0010
##    100        0.8113             nan     0.0100    0.0007
##    120        0.7646             nan     0.0100    0.0007
##    140        0.7264             nan     0.0100    0.0005
##    160        0.6946             nan     0.0100    0.0004
##    180        0.6647             nan     0.0100    0.0003
##    200        0.6392             nan     0.0100    0.0004
##    220        0.6172             nan     0.0100    0.0002
##    240        0.5980             nan     0.0100    0.0000
##    260        0.5804             nan     0.0100    0.0001
##    280        0.5634             nan     0.0100    0.0001
##    300        0.5477             nan     0.0100    0.0000
##    320        0.5343             nan     0.0100   -0.0000
##    340        0.5206             nan     0.0100    0.0000
##    360        0.5077             nan     0.0100   -0.0000
##    380        0.4947             nan     0.0100    0.0002
##    400        0.4825             nan     0.0100    0.0002
##    420        0.4712             nan     0.0100    0.0002
##    440        0.4610             nan     0.0100    0.0000
##    460        0.4507             nan     0.0100   -0.0001
##    480        0.4414             nan     0.0100   -0.0001
##    500        0.4317             nan     0.0100    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3112             nan     0.0100    0.0042
##      2        1.3015             nan     0.0100    0.0044
##      3        1.2912             nan     0.0100    0.0047
##      4        1.2819             nan     0.0100    0.0043
##      5        1.2733             nan     0.0100    0.0041
##      6        1.2645             nan     0.0100    0.0042
##      7        1.2563             nan     0.0100    0.0036
##      8        1.2472             nan     0.0100    0.0039
##      9        1.2394             nan     0.0100    0.0032
##     10        1.2306             nan     0.0100    0.0038
##     20        1.1561             nan     0.0100    0.0032
##     40        1.0346             nan     0.0100    0.0022
##     60        0.9442             nan     0.0100    0.0016
##     80        0.8721             nan     0.0100    0.0013
##    100        0.8163             nan     0.0100    0.0009
##    120        0.7695             nan     0.0100    0.0007
##    140        0.7311             nan     0.0100    0.0005
##    160        0.6998             nan     0.0100    0.0005
##    180        0.6719             nan     0.0100    0.0003
##    200        0.6486             nan     0.0100    0.0002
##    220        0.6277             nan     0.0100    0.0002
##    240        0.6084             nan     0.0100    0.0003
##    260        0.5894             nan     0.0100    0.0001
##    280        0.5719             nan     0.0100    0.0003
##    300        0.5568             nan     0.0100    0.0001
##    320        0.5434             nan     0.0100    0.0000
##    340        0.5303             nan     0.0100    0.0001
##    360        0.5175             nan     0.0100   -0.0000
##    380        0.5054             nan     0.0100   -0.0001
##    400        0.4933             nan     0.0100   -0.0000
##    420        0.4822             nan     0.0100    0.0001
##    440        0.4723             nan     0.0100    0.0001
##    460        0.4628             nan     0.0100    0.0001
##    480        0.4538             nan     0.0100   -0.0002
##    500        0.4440             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3101             nan     0.0100    0.0048
##      2        1.3010             nan     0.0100    0.0042
##      3        1.2920             nan     0.0100    0.0044
##      4        1.2826             nan     0.0100    0.0042
##      5        1.2740             nan     0.0100    0.0037
##      6        1.2652             nan     0.0100    0.0038
##      7        1.2561             nan     0.0100    0.0039
##      8        1.2474             nan     0.0100    0.0039
##      9        1.2395             nan     0.0100    0.0034
##     10        1.2318             nan     0.0100    0.0035
##     20        1.1569             nan     0.0100    0.0031
##     40        1.0378             nan     0.0100    0.0018
##     60        0.9495             nan     0.0100    0.0015
##     80        0.8798             nan     0.0100    0.0014
##    100        0.8245             nan     0.0100    0.0012
##    120        0.7781             nan     0.0100    0.0007
##    140        0.7395             nan     0.0100    0.0005
##    160        0.7082             nan     0.0100    0.0004
##    180        0.6813             nan     0.0100    0.0003
##    200        0.6588             nan     0.0100    0.0003
##    220        0.6371             nan     0.0100    0.0002
##    240        0.6187             nan     0.0100    0.0000
##    260        0.6018             nan     0.0100   -0.0000
##    280        0.5861             nan     0.0100   -0.0000
##    300        0.5714             nan     0.0100    0.0002
##    320        0.5580             nan     0.0100    0.0001
##    340        0.5453             nan     0.0100   -0.0000
##    360        0.5342             nan     0.0100   -0.0001
##    380        0.5230             nan     0.0100   -0.0001
##    400        0.5118             nan     0.0100   -0.0001
##    420        0.5021             nan     0.0100    0.0000
##    440        0.4924             nan     0.0100   -0.0001
##    460        0.4824             nan     0.0100   -0.0001
##    480        0.4731             nan     0.0100   -0.0002
##    500        0.4643             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2360             nan     0.1000    0.0402
##      2        1.1683             nan     0.1000    0.0305
##      3        1.1064             nan     0.1000    0.0307
##      4        1.0539             nan     0.1000    0.0215
##      5        1.0074             nan     0.1000    0.0200
##      6        0.9648             nan     0.1000    0.0186
##      7        0.9310             nan     0.1000    0.0101
##      8        0.8990             nan     0.1000    0.0128
##      9        0.8709             nan     0.1000    0.0105
##     10        0.8478             nan     0.1000    0.0081
##     20        0.6969             nan     0.1000    0.0007
##     40        0.5698             nan     0.1000   -0.0004
##     60        0.4949             nan     0.1000   -0.0005
##     80        0.4359             nan     0.1000   -0.0014
##    100        0.3805             nan     0.1000   -0.0002
##    120        0.3358             nan     0.1000   -0.0009
##    140        0.2991             nan     0.1000   -0.0004
##    160        0.2675             nan     0.1000   -0.0006
##    180        0.2436             nan     0.1000   -0.0001
##    200        0.2218             nan     0.1000   -0.0008
##    220        0.2019             nan     0.1000   -0.0002
##    240        0.1830             nan     0.1000   -0.0007
##    260        0.1668             nan     0.1000   -0.0003
##    280        0.1520             nan     0.1000   -0.0003
##    300        0.1393             nan     0.1000   -0.0002
##    320        0.1279             nan     0.1000    0.0001
##    340        0.1172             nan     0.1000   -0.0002
##    360        0.1090             nan     0.1000   -0.0003
##    380        0.1004             nan     0.1000   -0.0003
##    400        0.0925             nan     0.1000   -0.0005
##    420        0.0854             nan     0.1000   -0.0001
##    440        0.0790             nan     0.1000   -0.0001
##    460        0.0719             nan     0.1000   -0.0001
##    480        0.0666             nan     0.1000   -0.0001
##    500        0.0618             nan     0.1000   -0.0000
## 
## [similar per-iteration training logs for the remaining gbm fits at shrinkage = 0.100 truncated; each run shows training deviance decreasing steadily over 500 iterations]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3195             nan     0.0010    0.0004
##      3        1.3187             nan     0.0010    0.0004
##      4        1.3179             nan     0.0010    0.0004
##      5        1.3170             nan     0.0010    0.0004
##      6        1.3162             nan     0.0010    0.0004
##      7        1.3153             nan     0.0010    0.0004
##      8        1.3144             nan     0.0010    0.0004
##      9        1.3136             nan     0.0010    0.0004
##     10        1.3128             nan     0.0010    0.0004
##     20        1.3043             nan     0.0010    0.0004
##     40        1.2883             nan     0.0010    0.0003
##     60        1.2726             nan     0.0010    0.0003
##     80        1.2579             nan     0.0010    0.0003
##    100        1.2435             nan     0.0010    0.0003
##    120        1.2295             nan     0.0010    0.0003
##    140        1.2160             nan     0.0010    0.0003
##    160        1.2028             nan     0.0010    0.0003
##    180        1.1899             nan     0.0010    0.0003
##    200        1.1776             nan     0.0010    0.0003
##    220        1.1658             nan     0.0010    0.0003
##    240        1.1540             nan     0.0010    0.0002
##    260        1.1427             nan     0.0010    0.0002
##    280        1.1320             nan     0.0010    0.0002
##    300        1.1212             nan     0.0010    0.0002
##    320        1.1107             nan     0.0010    0.0002
##    340        1.1008             nan     0.0010    0.0002
##    360        1.0911             nan     0.0010    0.0002
##    380        1.0815             nan     0.0010    0.0002
##    400        1.0724             nan     0.0010    0.0002
##    420        1.0633             nan     0.0010    0.0002
##    440        1.0545             nan     0.0010    0.0002
##    460        1.0458             nan     0.0010    0.0002
##    480        1.0374             nan     0.0010    0.0002
##    500        1.0292             nan     0.0010    0.0002
## 
## [similar per-iteration training logs for additional gbm fits at shrinkage = 0.001 truncated; deviance decreases much more slowly at this learning rate]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0005
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3184             nan     0.0010    0.0004
##      4        1.3174             nan     0.0010    0.0005
##      5        1.3165             nan     0.0010    0.0004
##      6        1.3155             nan     0.0010    0.0004
##      7        1.3145             nan     0.0010    0.0004
##      8        1.3136             nan     0.0010    0.0004
##      9        1.3128             nan     0.0010    0.0004
##     10        1.3118             nan     0.0010    0.0004
##     20        1.3028             nan     0.0010    0.0004
##     40        1.2850             nan     0.0010    0.0004
##     60        1.2679             nan     0.0010    0.0004
##     80        1.2514             nan     0.0010    0.0004
##    100        1.2356             nan     0.0010    0.0003
##    120        1.2204             nan     0.0010    0.0003
##    140        1.2057             nan     0.0010    0.0003
##    160        1.1912             nan     0.0010    0.0003
##    180        1.1773             nan     0.0010    0.0002
##    200        1.1635             nan     0.0010    0.0003
##    220        1.1504             nan     0.0010    0.0003
##    240        1.1381             nan     0.0010    0.0003
##    260        1.1256             nan     0.0010    0.0002
##    280        1.1136             nan     0.0010    0.0003
##    300        1.1018             nan     0.0010    0.0002
##    320        1.0907             nan     0.0010    0.0002
##    340        1.0796             nan     0.0010    0.0002
##    360        1.0688             nan     0.0010    0.0002
##    380        1.0582             nan     0.0010    0.0002
##    400        1.0480             nan     0.0010    0.0002
##    420        1.0380             nan     0.0010    0.0002
##    440        1.0288             nan     0.0010    0.0001
##    460        1.0195             nan     0.0010    0.0002
##    480        1.0103             nan     0.0010    0.0002
##    500        1.0014             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3194             nan     0.0010    0.0004
##      3        1.3185             nan     0.0010    0.0004
##      4        1.3175             nan     0.0010    0.0004
##      5        1.3167             nan     0.0010    0.0004
##      6        1.3157             nan     0.0010    0.0004
##      7        1.3148             nan     0.0010    0.0004
##      8        1.3139             nan     0.0010    0.0004
##      9        1.3129             nan     0.0010    0.0004
##     10        1.3120             nan     0.0010    0.0004
##     20        1.3033             nan     0.0010    0.0004
##     40        1.2858             nan     0.0010    0.0004
##     60        1.2689             nan     0.0010    0.0004
##     80        1.2524             nan     0.0010    0.0004
##    100        1.2366             nan     0.0010    0.0004
##    120        1.2217             nan     0.0010    0.0003
##    140        1.2072             nan     0.0010    0.0003
##    160        1.1930             nan     0.0010    0.0003
##    180        1.1790             nan     0.0010    0.0003
##    200        1.1655             nan     0.0010    0.0003
##    220        1.1525             nan     0.0010    0.0003
##    240        1.1398             nan     0.0010    0.0003
##    260        1.1276             nan     0.0010    0.0002
##    280        1.1159             nan     0.0010    0.0003
##    300        1.1042             nan     0.0010    0.0002
##    320        1.0932             nan     0.0010    0.0002
##    340        1.0823             nan     0.0010    0.0002
##    360        1.0716             nan     0.0010    0.0002
##    380        1.0612             nan     0.0010    0.0002
##    400        1.0511             nan     0.0010    0.0002
##    420        1.0413             nan     0.0010    0.0002
##    440        1.0319             nan     0.0010    0.0002
##    460        1.0226             nan     0.0010    0.0002
##    480        1.0135             nan     0.0010    0.0002
##    500        1.0046             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3130             nan     0.0100    0.0041
##      2        1.3038             nan     0.0100    0.0041
##      3        1.2962             nan     0.0100    0.0037
##      4        1.2886             nan     0.0100    0.0038
##      5        1.2808             nan     0.0100    0.0035
##      6        1.2729             nan     0.0100    0.0033
##      7        1.2654             nan     0.0100    0.0036
##      8        1.2583             nan     0.0100    0.0031
##      9        1.2515             nan     0.0100    0.0032
##     10        1.2439             nan     0.0100    0.0030
##     20        1.1763             nan     0.0100    0.0027
##     40        1.0718             nan     0.0100    0.0018
##     60        0.9913             nan     0.0100    0.0016
##     80        0.9285             nan     0.0100    0.0011
##    100        0.8780             nan     0.0100    0.0008
##    120        0.8357             nan     0.0100    0.0006
##    140        0.8009             nan     0.0100    0.0007
##    160        0.7715             nan     0.0100    0.0004
##    180        0.7453             nan     0.0100    0.0000
##    200        0.7212             nan     0.0100    0.0002
##    220        0.7009             nan     0.0100    0.0003
##    240        0.6842             nan     0.0100    0.0000
##    260        0.6690             nan     0.0100    0.0001
##    280        0.6551             nan     0.0100   -0.0000
##    300        0.6419             nan     0.0100   -0.0001
##    320        0.6281             nan     0.0100    0.0000
##    340        0.6169             nan     0.0100    0.0002
##    360        0.6054             nan     0.0100   -0.0000
##    380        0.5947             nan     0.0100   -0.0000
##    400        0.5846             nan     0.0100   -0.0001
##    420        0.5760             nan     0.0100   -0.0002
##    440        0.5679             nan     0.0100   -0.0000
##    460        0.5596             nan     0.0100    0.0001
##    480        0.5514             nan     0.0100   -0.0001
##    500        0.5427             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3126             nan     0.0100    0.0041
##      2        1.3043             nan     0.0100    0.0039
##      3        1.2955             nan     0.0100    0.0042
##      4        1.2882             nan     0.0100    0.0031
##      5        1.2804             nan     0.0100    0.0037
##      6        1.2726             nan     0.0100    0.0035
##      7        1.2653             nan     0.0100    0.0031
##      8        1.2584             nan     0.0100    0.0031
##      9        1.2505             nan     0.0100    0.0037
##     10        1.2425             nan     0.0100    0.0037
##     20        1.1771             nan     0.0100    0.0025
##     40        1.0711             nan     0.0100    0.0020
##     60        0.9923             nan     0.0100    0.0016
##     80        0.9283             nan     0.0100    0.0011
##    100        0.8784             nan     0.0100    0.0009
##    120        0.8380             nan     0.0100    0.0007
##    140        0.8025             nan     0.0100    0.0006
##    160        0.7744             nan     0.0100    0.0005
##    180        0.7507             nan     0.0100    0.0002
##    200        0.7281             nan     0.0100    0.0004
##    220        0.7092             nan     0.0100    0.0003
##    240        0.6921             nan     0.0100   -0.0001
##    260        0.6761             nan     0.0100    0.0001
##    280        0.6629             nan     0.0100    0.0001
##    300        0.6511             nan     0.0100   -0.0001
##    320        0.6389             nan     0.0100    0.0002
##    340        0.6284             nan     0.0100    0.0000
##    360        0.6178             nan     0.0100   -0.0000
##    380        0.6074             nan     0.0100    0.0000
##    400        0.5973             nan     0.0100   -0.0001
##    420        0.5882             nan     0.0100    0.0001
##    440        0.5794             nan     0.0100   -0.0000
##    460        0.5701             nan     0.0100   -0.0002
##    480        0.5614             nan     0.0100   -0.0001
##    500        0.5532             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3129             nan     0.0100    0.0038
##      2        1.3044             nan     0.0100    0.0041
##      3        1.2966             nan     0.0100    0.0037
##      4        1.2888             nan     0.0100    0.0036
##      5        1.2811             nan     0.0100    0.0039
##      6        1.2742             nan     0.0100    0.0033
##      7        1.2660             nan     0.0100    0.0034
##      8        1.2586             nan     0.0100    0.0035
##      9        1.2512             nan     0.0100    0.0032
##     10        1.2438             nan     0.0100    0.0035
##     20        1.1781             nan     0.0100    0.0029
##     40        1.0728             nan     0.0100    0.0020
##     60        0.9935             nan     0.0100    0.0013
##     80        0.9303             nan     0.0100    0.0010
##    100        0.8809             nan     0.0100    0.0007
##    120        0.8390             nan     0.0100    0.0006
##    140        0.8059             nan     0.0100    0.0004
##    160        0.7776             nan     0.0100    0.0003
##    180        0.7527             nan     0.0100    0.0004
##    200        0.7325             nan     0.0100    0.0000
##    220        0.7147             nan     0.0100   -0.0000
##    240        0.6977             nan     0.0100    0.0002
##    260        0.6821             nan     0.0100    0.0002
##    280        0.6685             nan     0.0100   -0.0000
##    300        0.6551             nan     0.0100    0.0001
##    320        0.6427             nan     0.0100   -0.0000
##    340        0.6313             nan     0.0100    0.0001
##    360        0.6206             nan     0.0100    0.0000
##    380        0.6099             nan     0.0100   -0.0000
##    400        0.6006             nan     0.0100   -0.0000
##    420        0.5904             nan     0.0100   -0.0000
##    440        0.5814             nan     0.0100   -0.0000
##    460        0.5724             nan     0.0100    0.0000
##    480        0.5641             nan     0.0100    0.0000
##    500        0.5561             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3120             nan     0.0100    0.0043
##      2        1.3030             nan     0.0100    0.0040
##      3        1.2947             nan     0.0100    0.0040
##      4        1.2858             nan     0.0100    0.0040
##      5        1.2779             nan     0.0100    0.0037
##      6        1.2695             nan     0.0100    0.0032
##      7        1.2608             nan     0.0100    0.0037
##      8        1.2532             nan     0.0100    0.0032
##      9        1.2452             nan     0.0100    0.0040
##     10        1.2376             nan     0.0100    0.0032
##     20        1.1686             nan     0.0100    0.0026
##     40        1.0588             nan     0.0100    0.0022
##     60        0.9743             nan     0.0100    0.0015
##     80        0.9081             nan     0.0100    0.0011
##    100        0.8547             nan     0.0100    0.0008
##    120        0.8125             nan     0.0100    0.0005
##    140        0.7767             nan     0.0100    0.0005
##    160        0.7465             nan     0.0100    0.0005
##    180        0.7190             nan     0.0100    0.0003
##    200        0.6944             nan     0.0100    0.0002
##    220        0.6737             nan     0.0100    0.0000
##    240        0.6536             nan     0.0100    0.0002
##    260        0.6349             nan     0.0100    0.0001
##    280        0.6188             nan     0.0100    0.0001
##    300        0.6038             nan     0.0100   -0.0000
##    320        0.5890             nan     0.0100    0.0001
##    340        0.5763             nan     0.0100    0.0001
##    360        0.5633             nan     0.0100    0.0000
##    380        0.5514             nan     0.0100   -0.0000
##    400        0.5397             nan     0.0100    0.0002
##    420        0.5289             nan     0.0100   -0.0000
##    440        0.5183             nan     0.0100    0.0000
##    460        0.5091             nan     0.0100   -0.0000
##    480        0.4999             nan     0.0100    0.0000
##    500        0.4911             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3122             nan     0.0100    0.0043
##      2        1.3025             nan     0.0100    0.0034
##      3        1.2936             nan     0.0100    0.0039
##      4        1.2851             nan     0.0100    0.0038
##      5        1.2774             nan     0.0100    0.0037
##      6        1.2693             nan     0.0100    0.0034
##      7        1.2623             nan     0.0100    0.0035
##      8        1.2540             nan     0.0100    0.0036
##      9        1.2467             nan     0.0100    0.0028
##     10        1.2392             nan     0.0100    0.0035
##     20        1.1690             nan     0.0100    0.0024
##     40        1.0565             nan     0.0100    0.0022
##     60        0.9728             nan     0.0100    0.0017
##     80        0.9069             nan     0.0100    0.0012
##    100        0.8544             nan     0.0100    0.0011
##    120        0.8105             nan     0.0100    0.0008
##    140        0.7735             nan     0.0100    0.0006
##    160        0.7444             nan     0.0100    0.0003
##    180        0.7180             nan     0.0100    0.0004
##    200        0.6958             nan     0.0100    0.0001
##    220        0.6746             nan     0.0100    0.0004
##    240        0.6551             nan     0.0100    0.0002
##    260        0.6369             nan     0.0100    0.0002
##    280        0.6207             nan     0.0100    0.0001
##    300        0.6065             nan     0.0100    0.0001
##    320        0.5940             nan     0.0100    0.0000
##    340        0.5823             nan     0.0100   -0.0001
##    360        0.5699             nan     0.0100    0.0000
##    380        0.5584             nan     0.0100   -0.0003
##    400        0.5466             nan     0.0100   -0.0000
##    420        0.5361             nan     0.0100   -0.0001
##    440        0.5250             nan     0.0100    0.0001
##    460        0.5156             nan     0.0100   -0.0002
##    480        0.5059             nan     0.0100   -0.0002
##    500        0.4974             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3129             nan     0.0100    0.0038
##      2        1.3045             nan     0.0100    0.0036
##      3        1.2956             nan     0.0100    0.0038
##      4        1.2872             nan     0.0100    0.0041
##      5        1.2791             nan     0.0100    0.0032
##      6        1.2708             nan     0.0100    0.0041
##      7        1.2630             nan     0.0100    0.0037
##      8        1.2552             nan     0.0100    0.0036
##      9        1.2474             nan     0.0100    0.0035
##     10        1.2402             nan     0.0100    0.0033
##     20        1.1727             nan     0.0100    0.0030
##     40        1.0626             nan     0.0100    0.0019
##     60        0.9786             nan     0.0100    0.0016
##     80        0.9124             nan     0.0100    0.0013
##    100        0.8596             nan     0.0100    0.0009
##    120        0.8179             nan     0.0100    0.0006
##    140        0.7824             nan     0.0100    0.0008
##    160        0.7506             nan     0.0100    0.0002
##    180        0.7251             nan     0.0100    0.0003
##    200        0.7030             nan     0.0100    0.0002
##    220        0.6840             nan     0.0100    0.0001
##    240        0.6653             nan     0.0100   -0.0000
##    260        0.6498             nan     0.0100   -0.0001
##    280        0.6339             nan     0.0100   -0.0001
##    300        0.6187             nan     0.0100    0.0001
##    320        0.6049             nan     0.0100    0.0001
##    340        0.5924             nan     0.0100    0.0001
##    360        0.5802             nan     0.0100   -0.0000
##    380        0.5674             nan     0.0100    0.0001
##    400        0.5566             nan     0.0100   -0.0001
##    420        0.5455             nan     0.0100    0.0001
##    440        0.5358             nan     0.0100   -0.0001
##    460        0.5257             nan     0.0100   -0.0001
##    480        0.5155             nan     0.0100   -0.0001
##    500        0.5063             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3114             nan     0.0100    0.0040
##      2        1.3026             nan     0.0100    0.0040
##      3        1.2929             nan     0.0100    0.0039
##      4        1.2840             nan     0.0100    0.0039
##      5        1.2758             nan     0.0100    0.0037
##      6        1.2671             nan     0.0100    0.0039
##      7        1.2582             nan     0.0100    0.0040
##      8        1.2501             nan     0.0100    0.0037
##      9        1.2416             nan     0.0100    0.0040
##     10        1.2337             nan     0.0100    0.0037
##     20        1.1617             nan     0.0100    0.0026
##     40        1.0445             nan     0.0100    0.0021
##     60        0.9548             nan     0.0100    0.0017
##     80        0.8844             nan     0.0100    0.0012
##    100        0.8296             nan     0.0100    0.0009
##    120        0.7843             nan     0.0100    0.0007
##    140        0.7457             nan     0.0100    0.0007
##    160        0.7120             nan     0.0100    0.0002
##    180        0.6846             nan     0.0100    0.0003
##    200        0.6603             nan     0.0100    0.0003
##    220        0.6382             nan     0.0100    0.0003
##    240        0.6174             nan     0.0100    0.0002
##    260        0.5983             nan     0.0100   -0.0001
##    280        0.5816             nan     0.0100    0.0000
##    300        0.5649             nan     0.0100    0.0000
##    320        0.5506             nan     0.0100    0.0002
##    340        0.5360             nan     0.0100    0.0001
##    360        0.5237             nan     0.0100   -0.0002
##    380        0.5117             nan     0.0100   -0.0000
##    400        0.4998             nan     0.0100    0.0001
##    420        0.4877             nan     0.0100    0.0002
##    440        0.4772             nan     0.0100    0.0001
##    460        0.4660             nan     0.0100   -0.0001
##    480        0.4559             nan     0.0100   -0.0001
##    500        0.4466             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3119             nan     0.0100    0.0042
##      2        1.3027             nan     0.0100    0.0042
##      3        1.2935             nan     0.0100    0.0042
##      4        1.2843             nan     0.0100    0.0040
##      5        1.2758             nan     0.0100    0.0038
##      6        1.2676             nan     0.0100    0.0038
##      7        1.2590             nan     0.0100    0.0038
##      8        1.2512             nan     0.0100    0.0035
##      9        1.2435             nan     0.0100    0.0036
##     10        1.2353             nan     0.0100    0.0034
##     20        1.1646             nan     0.0100    0.0029
##     40        1.0499             nan     0.0100    0.0020
##     60        0.9592             nan     0.0100    0.0019
##     80        0.8904             nan     0.0100    0.0011
##    100        0.8343             nan     0.0100    0.0009
##    120        0.7878             nan     0.0100    0.0006
##    140        0.7505             nan     0.0100    0.0004
##    160        0.7169             nan     0.0100    0.0004
##    180        0.6871             nan     0.0100    0.0003
##    200        0.6625             nan     0.0100    0.0002
##    220        0.6410             nan     0.0100    0.0000
##    240        0.6207             nan     0.0100    0.0001
##    260        0.6012             nan     0.0100    0.0001
##    280        0.5838             nan     0.0100    0.0000
##    300        0.5675             nan     0.0100    0.0002
##    320        0.5525             nan     0.0100    0.0000
##    340        0.5373             nan     0.0100    0.0000
##    360        0.5255             nan     0.0100   -0.0000
##    380        0.5129             nan     0.0100    0.0000
##    400        0.5010             nan     0.0100    0.0000
##    420        0.4889             nan     0.0100   -0.0001
##    440        0.4781             nan     0.0100   -0.0002
##    460        0.4683             nan     0.0100   -0.0000
##    480        0.4587             nan     0.0100   -0.0000
##    500        0.4488             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3121             nan     0.0100    0.0041
##      2        1.3029             nan     0.0100    0.0039
##      3        1.2944             nan     0.0100    0.0042
##      4        1.2860             nan     0.0100    0.0038
##      5        1.2781             nan     0.0100    0.0035
##      6        1.2698             nan     0.0100    0.0038
##      7        1.2618             nan     0.0100    0.0033
##      8        1.2531             nan     0.0100    0.0038
##      9        1.2456             nan     0.0100    0.0035
##     10        1.2373             nan     0.0100    0.0037
##     20        1.1674             nan     0.0100    0.0030
##     40        1.0538             nan     0.0100    0.0023
##     60        0.9659             nan     0.0100    0.0013
##     80        0.8987             nan     0.0100    0.0011
##    100        0.8445             nan     0.0100    0.0009
##    120        0.7980             nan     0.0100    0.0007
##    140        0.7615             nan     0.0100    0.0005
##    160        0.7286             nan     0.0100    0.0003
##    180        0.7021             nan     0.0100    0.0002
##    200        0.6769             nan     0.0100    0.0002
##    220        0.6554             nan     0.0100    0.0002
##    240        0.6359             nan     0.0100   -0.0000
##    260        0.6197             nan     0.0100   -0.0000
##    280        0.6027             nan     0.0100    0.0000
##    300        0.5876             nan     0.0100   -0.0000
##    320        0.5724             nan     0.0100    0.0001
##    340        0.5576             nan     0.0100   -0.0000
##    360        0.5443             nan     0.0100   -0.0001
##    380        0.5325             nan     0.0100   -0.0001
##    400        0.5207             nan     0.0100   -0.0002
##    420        0.5083             nan     0.0100   -0.0001
##    440        0.4977             nan     0.0100   -0.0000
##    460        0.4871             nan     0.0100    0.0000
##    480        0.4768             nan     0.0100   -0.0001
##    500        0.4661             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2443             nan     0.1000    0.0323
##      2        1.1750             nan     0.1000    0.0320
##      3        1.1131             nan     0.1000    0.0266
##      4        1.0713             nan     0.1000    0.0193
##      5        1.0296             nan     0.1000    0.0192
##      6        0.9890             nan     0.1000    0.0195
##      7        0.9552             nan     0.1000    0.0153
##      8        0.9293             nan     0.1000    0.0092
##      9        0.9025             nan     0.1000    0.0100
##     10        0.8796             nan     0.1000    0.0091
##     20        0.7259             nan     0.1000    0.0023
##     40        0.5947             nan     0.1000   -0.0019
##     60        0.5178             nan     0.1000    0.0001
##     80        0.4572             nan     0.1000   -0.0007
##    100        0.3960             nan     0.1000   -0.0011
##    120        0.3477             nan     0.1000   -0.0003
##    140        0.3100             nan     0.1000    0.0000
##    160        0.2745             nan     0.1000   -0.0004
##    180        0.2487             nan     0.1000   -0.0005
##    200        0.2226             nan     0.1000   -0.0001
##    220        0.2020             nan     0.1000   -0.0006
##    240        0.1840             nan     0.1000   -0.0003
##    260        0.1668             nan     0.1000   -0.0004
##    280        0.1514             nan     0.1000   -0.0009
##    300        0.1379             nan     0.1000   -0.0000
##    320        0.1267             nan     0.1000   -0.0001
##    340        0.1159             nan     0.1000   -0.0004
##    360        0.1077             nan     0.1000   -0.0001
##    380        0.0989             nan     0.1000   -0.0004
##    400        0.0913             nan     0.1000   -0.0001
##    420        0.0841             nan     0.1000   -0.0003
##    440        0.0772             nan     0.1000   -0.0002
##    460        0.0715             nan     0.1000   -0.0002
##    480        0.0662             nan     0.1000   -0.0002
##    500        0.0609             nan     0.1000   -0.0004
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2395             nan     0.1000    0.0399
##      2        1.1757             nan     0.1000    0.0288
##      3        1.1171             nan     0.1000    0.0281
##      4        1.0722             nan     0.1000    0.0170
##      5        1.0259             nan     0.1000    0.0180
##      6        0.9897             nan     0.1000    0.0160
##      7        0.9586             nan     0.1000    0.0125
##      8        0.9313             nan     0.1000    0.0105
##      9        0.9054             nan     0.1000    0.0099
##     10        0.8820             nan     0.1000    0.0101
##     20        0.7385             nan     0.1000    0.0043
##     40        0.6000             nan     0.1000    0.0006
##     60        0.5178             nan     0.1000   -0.0007
##     80        0.4477             nan     0.1000   -0.0006
##    100        0.3967             nan     0.1000   -0.0001
##    120        0.3512             nan     0.1000   -0.0007
##    140        0.3130             nan     0.1000    0.0005
##    160        0.2814             nan     0.1000   -0.0004
##    180        0.2520             nan     0.1000   -0.0008
##    200        0.2297             nan     0.1000   -0.0003
##    220        0.2086             nan     0.1000   -0.0004
##    240        0.1879             nan     0.1000   -0.0010
##    260        0.1721             nan     0.1000   -0.0001
##    280        0.1577             nan     0.1000   -0.0006
##    300        0.1420             nan     0.1000    0.0002
##    320        0.1299             nan     0.1000   -0.0005
##    340        0.1189             nan     0.1000   -0.0005
##    360        0.1092             nan     0.1000   -0.0002
##    380        0.1006             nan     0.1000   -0.0003
##    400        0.0925             nan     0.1000   -0.0004
##    420        0.0856             nan     0.1000   -0.0002
##    440        0.0787             nan     0.1000   -0.0002
##    460        0.0735             nan     0.1000   -0.0001
##    480        0.0674             nan     0.1000   -0.0003
##    500        0.0625             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2435             nan     0.1000    0.0367
##      2        1.1748             nan     0.1000    0.0324
##      3        1.1148             nan     0.1000    0.0269
##      4        1.0647             nan     0.1000    0.0212
##      5        1.0249             nan     0.1000    0.0189
##      6        0.9871             nan     0.1000    0.0142
##      7        0.9521             nan     0.1000    0.0132
##      8        0.9261             nan     0.1000    0.0117
##      9        0.9022             nan     0.1000    0.0087
##     10        0.8790             nan     0.1000    0.0086
##     20        0.7405             nan     0.1000    0.0023
##     40        0.6139             nan     0.1000    0.0003
##     60        0.5259             nan     0.1000   -0.0002
##     80        0.4583             nan     0.1000   -0.0035
##    100        0.4020             nan     0.1000   -0.0013
##    120        0.3589             nan     0.1000   -0.0002
##    140        0.3239             nan     0.1000   -0.0005
##    160        0.2899             nan     0.1000   -0.0003
##    180        0.2637             nan     0.1000   -0.0015
##    200        0.2421             nan     0.1000   -0.0002
##    220        0.2192             nan     0.1000   -0.0005
##    240        0.2013             nan     0.1000   -0.0003
##    260        0.1845             nan     0.1000   -0.0015
##    280        0.1700             nan     0.1000   -0.0008
##    300        0.1557             nan     0.1000   -0.0005
##    320        0.1421             nan     0.1000   -0.0001
##    340        0.1305             nan     0.1000   -0.0003
##    360        0.1197             nan     0.1000   -0.0002
##    380        0.1101             nan     0.1000   -0.0004
##    400        0.1017             nan     0.1000   -0.0002
##    420        0.0931             nan     0.1000   -0.0001
##    440        0.0861             nan     0.1000   -0.0003
##    460        0.0804             nan     0.1000   -0.0001
##    480        0.0748             nan     0.1000   -0.0004
##    500        0.0696             nan     0.1000   -0.0004
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2329             nan     0.1000    0.0411
##      2        1.1648             nan     0.1000    0.0341
##      3        1.1033             nan     0.1000    0.0247
##      4        1.0479             nan     0.1000    0.0253
##      5        1.0049             nan     0.1000    0.0194
##      6        0.9622             nan     0.1000    0.0169
##      7        0.9194             nan     0.1000    0.0167
##      8        0.8908             nan     0.1000    0.0116
##      9        0.8624             nan     0.1000    0.0108
##     10        0.8373             nan     0.1000    0.0097
##     20        0.6821             nan     0.1000    0.0033
##     40        0.5384             nan     0.1000   -0.0002
##     60        0.4549             nan     0.1000   -0.0012
##     80        0.3852             nan     0.1000   -0.0000
##    100        0.3334             nan     0.1000   -0.0009
##    120        0.2869             nan     0.1000    0.0003
##    140        0.2481             nan     0.1000   -0.0005
##    160        0.2219             nan     0.1000   -0.0002
##    180        0.1955             nan     0.1000   -0.0006
##    200        0.1733             nan     0.1000   -0.0005
##    220        0.1526             nan     0.1000   -0.0006
##    240        0.1360             nan     0.1000   -0.0006
##    260        0.1209             nan     0.1000   -0.0003
##    280        0.1078             nan     0.1000    0.0001
##    300        0.0966             nan     0.1000   -0.0002
##    320        0.0868             nan     0.1000   -0.0001
##    340        0.0778             nan     0.1000    0.0000
##    360        0.0704             nan     0.1000   -0.0003
##    380        0.0637             nan     0.1000   -0.0001
##    400        0.0572             nan     0.1000   -0.0003
##    420        0.0517             nan     0.1000   -0.0001
##    440        0.0468             nan     0.1000   -0.0002
##    460        0.0423             nan     0.1000   -0.0001
##    480        0.0382             nan     0.1000   -0.0001
##    500        0.0345             nan     0.1000    0.0000
## 
## [output truncated: analogous boosting logs for the remaining cross-validation resamples at shrinkage 0.1000 omitted]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3179             nan     0.0010    0.0004
##      4        1.3169             nan     0.0010    0.0004
##      5        1.3159             nan     0.0010    0.0004
##      6        1.3150             nan     0.0010    0.0004
##      7        1.3141             nan     0.0010    0.0004
##      8        1.3133             nan     0.0010    0.0004
##      9        1.3124             nan     0.0010    0.0004
##     10        1.3115             nan     0.0010    0.0004
##     20        1.3031             nan     0.0010    0.0003
##     40        1.2861             nan     0.0010    0.0004
##     60        1.2701             nan     0.0010    0.0003
##     80        1.2543             nan     0.0010    0.0004
##    100        1.2385             nan     0.0010    0.0004
##    120        1.2240             nan     0.0010    0.0003
##    140        1.2094             nan     0.0010    0.0003
##    160        1.1958             nan     0.0010    0.0003
##    180        1.1824             nan     0.0010    0.0003
##    200        1.1694             nan     0.0010    0.0003
##    220        1.1568             nan     0.0010    0.0003
##    240        1.1442             nan     0.0010    0.0003
##    260        1.1322             nan     0.0010    0.0002
##    280        1.1201             nan     0.0010    0.0003
##    300        1.1089             nan     0.0010    0.0002
##    320        1.0980             nan     0.0010    0.0002
##    340        1.0876             nan     0.0010    0.0002
##    360        1.0773             nan     0.0010    0.0002
##    380        1.0673             nan     0.0010    0.0002
##    400        1.0577             nan     0.0010    0.0002
##    420        1.0481             nan     0.0010    0.0002
##    440        1.0392             nan     0.0010    0.0002
##    460        1.0300             nan     0.0010    0.0002
##    480        1.0216             nan     0.0010    0.0002
##    500        1.0130             nan     0.0010    0.0002
## 
## [output truncated: analogous boosting logs for the remaining cross-validation resamples at shrinkage 0.0010 omitted]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3123             nan     0.0100    0.0040
##      2        1.3030             nan     0.0100    0.0043
##      3        1.2948             nan     0.0100    0.0035
##      4        1.2859             nan     0.0100    0.0042
##      5        1.2767             nan     0.0100    0.0040
##      6        1.2686             nan     0.0100    0.0040
##      7        1.2608             nan     0.0100    0.0038
##      8        1.2528             nan     0.0100    0.0038
##      9        1.2449             nan     0.0100    0.0035
##     10        1.2371             nan     0.0100    0.0036
##     20        1.1694             nan     0.0100    0.0024
##     40        1.0590             nan     0.0100    0.0021
##     60        0.9728             nan     0.0100    0.0016
##     80        0.9090             nan     0.0100    0.0014
##    100        0.8549             nan     0.0100    0.0009
##    120        0.8114             nan     0.0100    0.0006
##    140        0.7757             nan     0.0100    0.0006
##    160        0.7465             nan     0.0100    0.0004
##    180        0.7201             nan     0.0100    0.0004
##    200        0.6977             nan     0.0100    0.0003
##    220        0.6792             nan     0.0100    0.0002
##    240        0.6630             nan     0.0100    0.0001
##    260        0.6483             nan     0.0100   -0.0000
##    280        0.6337             nan     0.0100   -0.0001
##    300        0.6206             nan     0.0100    0.0001
##    320        0.6083             nan     0.0100    0.0002
##    340        0.5984             nan     0.0100    0.0001
##    360        0.5877             nan     0.0100   -0.0000
##    380        0.5763             nan     0.0100    0.0000
##    400        0.5671             nan     0.0100   -0.0000
##    420        0.5579             nan     0.0100   -0.0000
##    440        0.5495             nan     0.0100   -0.0001
##    460        0.5416             nan     0.0100   -0.0000
##    480        0.5335             nan     0.0100   -0.0001
##    500        0.5257             nan     0.0100   -0.0003
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3108             nan     0.0100    0.0046
##      2        1.3017             nan     0.0100    0.0040
##      3        1.2934             nan     0.0100    0.0038
##      4        1.2854             nan     0.0100    0.0038
##      5        1.2779             nan     0.0100    0.0029
##      6        1.2694             nan     0.0100    0.0037
##      7        1.2625             nan     0.0100    0.0032
##      8        1.2542             nan     0.0100    0.0039
##      9        1.2470             nan     0.0100    0.0034
##     10        1.2391             nan     0.0100    0.0036
##     20        1.1702             nan     0.0100    0.0027
##     40        1.0574             nan     0.0100    0.0025
##     60        0.9739             nan     0.0100    0.0013
##     80        0.9094             nan     0.0100    0.0012
##    100        0.8563             nan     0.0100    0.0007
##    120        0.8140             nan     0.0100    0.0006
##    140        0.7782             nan     0.0100    0.0005
##    160        0.7490             nan     0.0100    0.0004
##    180        0.7233             nan     0.0100    0.0003
##    200        0.7024             nan     0.0100    0.0001
##    220        0.6819             nan     0.0100    0.0002
##    240        0.6654             nan     0.0100    0.0000
##    260        0.6505             nan     0.0100   -0.0000
##    280        0.6355             nan     0.0100    0.0001
##    300        0.6230             nan     0.0100    0.0001
##    320        0.6116             nan     0.0100    0.0000
##    340        0.6003             nan     0.0100   -0.0000
##    360        0.5906             nan     0.0100    0.0000
##    380        0.5809             nan     0.0100    0.0001
##    400        0.5708             nan     0.0100    0.0001
##    420        0.5618             nan     0.0100   -0.0001
##    440        0.5534             nan     0.0100   -0.0000
##    460        0.5449             nan     0.0100   -0.0001
##    480        0.5371             nan     0.0100    0.0000
##    500        0.5301             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3113             nan     0.0100    0.0044
##      2        1.3027             nan     0.0100    0.0040
##      3        1.2947             nan     0.0100    0.0034
##      4        1.2861             nan     0.0100    0.0039
##      5        1.2781             nan     0.0100    0.0038
##      6        1.2703             nan     0.0100    0.0032
##      7        1.2627             nan     0.0100    0.0035
##      8        1.2543             nan     0.0100    0.0037
##      9        1.2474             nan     0.0100    0.0032
##     10        1.2399             nan     0.0100    0.0035
##     20        1.1713             nan     0.0100    0.0028
##     40        1.0600             nan     0.0100    0.0018
##     60        0.9744             nan     0.0100    0.0016
##     80        0.9093             nan     0.0100    0.0012
##    100        0.8570             nan     0.0100    0.0005
##    120        0.8142             nan     0.0100    0.0008
##    140        0.7795             nan     0.0100    0.0007
##    160        0.7500             nan     0.0100    0.0005
##    180        0.7254             nan     0.0100    0.0004
##    200        0.7038             nan     0.0100    0.0003
##    220        0.6857             nan     0.0100    0.0002
##    240        0.6684             nan     0.0100    0.0000
##    260        0.6541             nan     0.0100    0.0001
##    280        0.6412             nan     0.0100   -0.0000
##    300        0.6291             nan     0.0100    0.0000
##    320        0.6175             nan     0.0100   -0.0000
##    340        0.6070             nan     0.0100    0.0000
##    360        0.5960             nan     0.0100    0.0000
##    380        0.5871             nan     0.0100    0.0001
##    400        0.5782             nan     0.0100    0.0000
##    420        0.5692             nan     0.0100   -0.0002
##    440        0.5604             nan     0.0100   -0.0001
##    460        0.5526             nan     0.0100   -0.0001
##    480        0.5446             nan     0.0100   -0.0000
##    500        0.5373             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3110             nan     0.0100    0.0043
##      2        1.3015             nan     0.0100    0.0046
##      3        1.2926             nan     0.0100    0.0040
##      4        1.2839             nan     0.0100    0.0040
##      5        1.2755             nan     0.0100    0.0036
##      6        1.2665             nan     0.0100    0.0042
##      7        1.2580             nan     0.0100    0.0038
##      8        1.2493             nan     0.0100    0.0039
##      9        1.2408             nan     0.0100    0.0038
##     10        1.2337             nan     0.0100    0.0032
##     20        1.1601             nan     0.0100    0.0031
##     40        1.0446             nan     0.0100    0.0023
##     60        0.9559             nan     0.0100    0.0017
##     80        0.8868             nan     0.0100    0.0012
##    100        0.8304             nan     0.0100    0.0010
##    120        0.7861             nan     0.0100    0.0009
##    140        0.7481             nan     0.0100    0.0005
##    160        0.7169             nan     0.0100    0.0005
##    180        0.6900             nan     0.0100    0.0003
##    200        0.6667             nan     0.0100    0.0004
##    220        0.6460             nan     0.0100    0.0001
##    240        0.6267             nan     0.0100    0.0001
##    260        0.6095             nan     0.0100    0.0001
##    280        0.5940             nan     0.0100    0.0000
##    300        0.5795             nan     0.0100    0.0001
##    320        0.5667             nan     0.0100    0.0001
##    340        0.5545             nan     0.0100   -0.0002
##    360        0.5424             nan     0.0100   -0.0001
##    380        0.5318             nan     0.0100   -0.0001
##    400        0.5210             nan     0.0100    0.0001
##    420        0.5109             nan     0.0100   -0.0000
##    440        0.5018             nan     0.0100    0.0001
##    460        0.4930             nan     0.0100    0.0001
##    480        0.4835             nan     0.0100   -0.0000
##    500        0.4757             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3106             nan     0.0100    0.0046
##      2        1.3017             nan     0.0100    0.0039
##      3        1.2933             nan     0.0100    0.0036
##      4        1.2842             nan     0.0100    0.0039
##      5        1.2756             nan     0.0100    0.0042
##      6        1.2680             nan     0.0100    0.0035
##      7        1.2588             nan     0.0100    0.0037
##      8        1.2507             nan     0.0100    0.0033
##      9        1.2419             nan     0.0100    0.0037
##     10        1.2338             nan     0.0100    0.0041
##     20        1.1604             nan     0.0100    0.0029
##     40        1.0453             nan     0.0100    0.0021
##     60        0.9556             nan     0.0100    0.0015
##     80        0.8870             nan     0.0100    0.0011
##    100        0.8312             nan     0.0100    0.0008
##    120        0.7864             nan     0.0100    0.0008
##    140        0.7494             nan     0.0100    0.0004
##    160        0.7178             nan     0.0100    0.0002
##    180        0.6922             nan     0.0100    0.0003
##    200        0.6694             nan     0.0100    0.0002
##    220        0.6498             nan     0.0100   -0.0000
##    240        0.6315             nan     0.0100    0.0002
##    260        0.6152             nan     0.0100    0.0002
##    280        0.6005             nan     0.0100    0.0001
##    300        0.5870             nan     0.0100    0.0000
##    320        0.5732             nan     0.0100   -0.0000
##    340        0.5609             nan     0.0100   -0.0000
##    360        0.5495             nan     0.0100   -0.0001
##    380        0.5387             nan     0.0100   -0.0001
##    400        0.5287             nan     0.0100   -0.0001
##    420        0.5180             nan     0.0100    0.0000
##    440        0.5084             nan     0.0100    0.0001
##    460        0.4989             nan     0.0100   -0.0001
##    480        0.4901             nan     0.0100    0.0001
##    500        0.4812             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3111             nan     0.0100    0.0047
##      2        1.3012             nan     0.0100    0.0049
##      3        1.2917             nan     0.0100    0.0047
##      4        1.2831             nan     0.0100    0.0039
##      5        1.2736             nan     0.0100    0.0045
##      6        1.2655             nan     0.0100    0.0038
##      7        1.2574             nan     0.0100    0.0040
##      8        1.2493             nan     0.0100    0.0036
##      9        1.2407             nan     0.0100    0.0035
##     10        1.2330             nan     0.0100    0.0032
##     20        1.1589             nan     0.0100    0.0031
##     40        1.0464             nan     0.0100    0.0025
##     60        0.9586             nan     0.0100    0.0016
##     80        0.8905             nan     0.0100    0.0013
##    100        0.8357             nan     0.0100    0.0007
##    120        0.7913             nan     0.0100    0.0006
##    140        0.7533             nan     0.0100    0.0006
##    160        0.7235             nan     0.0100    0.0006
##    180        0.6978             nan     0.0100    0.0001
##    200        0.6742             nan     0.0100    0.0001
##    220        0.6544             nan     0.0100    0.0002
##    240        0.6369             nan     0.0100    0.0002
##    260        0.6202             nan     0.0100    0.0001
##    280        0.6060             nan     0.0100    0.0002
##    300        0.5914             nan     0.0100    0.0000
##    320        0.5781             nan     0.0100   -0.0001
##    340        0.5668             nan     0.0100   -0.0000
##    360        0.5556             nan     0.0100   -0.0001
##    380        0.5443             nan     0.0100   -0.0001
##    400        0.5351             nan     0.0100   -0.0000
##    420        0.5251             nan     0.0100   -0.0001
##    440        0.5153             nan     0.0100    0.0000
##    460        0.5060             nan     0.0100   -0.0002
##    480        0.4974             nan     0.0100   -0.0001
##    500        0.4891             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3105             nan     0.0100    0.0044
##      2        1.3017             nan     0.0100    0.0039
##      3        1.2927             nan     0.0100    0.0037
##      4        1.2836             nan     0.0100    0.0045
##      5        1.2739             nan     0.0100    0.0043
##      6        1.2650             nan     0.0100    0.0041
##      7        1.2566             nan     0.0100    0.0040
##      8        1.2474             nan     0.0100    0.0040
##      9        1.2387             nan     0.0100    0.0042
##     10        1.2297             nan     0.0100    0.0043
##     20        1.1510             nan     0.0100    0.0032
##     40        1.0302             nan     0.0100    0.0019
##     60        0.9386             nan     0.0100    0.0016
##     80        0.8680             nan     0.0100    0.0011
##    100        0.8104             nan     0.0100    0.0011
##    120        0.7640             nan     0.0100    0.0008
##    140        0.7255             nan     0.0100    0.0007
##    160        0.6915             nan     0.0100    0.0003
##    180        0.6626             nan     0.0100    0.0005
##    200        0.6373             nan     0.0100    0.0003
##    220        0.6144             nan     0.0100    0.0000
##    240        0.5954             nan     0.0100    0.0000
##    260        0.5768             nan     0.0100    0.0001
##    280        0.5601             nan     0.0100    0.0000
##    300        0.5455             nan     0.0100    0.0001
##    320        0.5313             nan     0.0100    0.0000
##    340        0.5171             nan     0.0100    0.0001
##    360        0.5049             nan     0.0100    0.0002
##    380        0.4926             nan     0.0100   -0.0001
##    400        0.4809             nan     0.0100    0.0000
##    420        0.4695             nan     0.0100    0.0001
##    440        0.4592             nan     0.0100   -0.0001
##    460        0.4496             nan     0.0100   -0.0002
##    480        0.4402             nan     0.0100   -0.0000
##    500        0.4303             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3115             nan     0.0100    0.0041
##      2        1.3013             nan     0.0100    0.0045
##      3        1.2921             nan     0.0100    0.0037
##      4        1.2831             nan     0.0100    0.0043
##      5        1.2740             nan     0.0100    0.0042
##      6        1.2652             nan     0.0100    0.0042
##      7        1.2565             nan     0.0100    0.0040
##      8        1.2474             nan     0.0100    0.0041
##      9        1.2391             nan     0.0100    0.0039
##     10        1.2311             nan     0.0100    0.0035
##     20        1.1535             nan     0.0100    0.0033
##     40        1.0334             nan     0.0100    0.0024
##     60        0.9422             nan     0.0100    0.0020
##     80        0.8699             nan     0.0100    0.0011
##    100        0.8136             nan     0.0100    0.0007
##    120        0.7653             nan     0.0100    0.0004
##    140        0.7273             nan     0.0100    0.0006
##    160        0.6955             nan     0.0100    0.0004
##    180        0.6672             nan     0.0100    0.0004
##    200        0.6414             nan     0.0100    0.0002
##    220        0.6202             nan     0.0100    0.0002
##    240        0.6009             nan     0.0100    0.0003
##    260        0.5825             nan     0.0100   -0.0000
##    280        0.5664             nan     0.0100    0.0001
##    300        0.5521             nan     0.0100   -0.0000
##    320        0.5383             nan     0.0100    0.0001
##    340        0.5252             nan     0.0100    0.0000
##    360        0.5128             nan     0.0100    0.0001
##    380        0.5014             nan     0.0100   -0.0001
##    400        0.4894             nan     0.0100   -0.0000
##    420        0.4786             nan     0.0100    0.0000
##    440        0.4678             nan     0.0100    0.0001
##    460        0.4574             nan     0.0100   -0.0001
##    480        0.4479             nan     0.0100   -0.0001
##    500        0.4383             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3112             nan     0.0100    0.0043
##      2        1.3015             nan     0.0100    0.0045
##      3        1.2916             nan     0.0100    0.0045
##      4        1.2815             nan     0.0100    0.0044
##      5        1.2724             nan     0.0100    0.0038
##      6        1.2634             nan     0.0100    0.0037
##      7        1.2552             nan     0.0100    0.0039
##      8        1.2466             nan     0.0100    0.0040
##      9        1.2385             nan     0.0100    0.0037
##     10        1.2303             nan     0.0100    0.0039
##     20        1.1551             nan     0.0100    0.0034
##     40        1.0345             nan     0.0100    0.0024
##     60        0.9460             nan     0.0100    0.0018
##     80        0.8750             nan     0.0100    0.0013
##    100        0.8180             nan     0.0100    0.0012
##    120        0.7737             nan     0.0100    0.0007
##    140        0.7361             nan     0.0100    0.0003
##    160        0.7041             nan     0.0100    0.0004
##    180        0.6763             nan     0.0100    0.0003
##    200        0.6520             nan     0.0100    0.0001
##    220        0.6301             nan     0.0100   -0.0000
##    240        0.6115             nan     0.0100    0.0002
##    260        0.5947             nan     0.0100    0.0001
##    280        0.5786             nan     0.0100    0.0002
##    300        0.5634             nan     0.0100    0.0000
##    320        0.5501             nan     0.0100   -0.0001
##    340        0.5362             nan     0.0100   -0.0002
##    360        0.5239             nan     0.0100    0.0000
##    380        0.5113             nan     0.0100    0.0001
##    400        0.5002             nan     0.0100   -0.0001
##    420        0.4894             nan     0.0100   -0.0000
##    440        0.4796             nan     0.0100   -0.0002
##    460        0.4697             nan     0.0100    0.0002
##    480        0.4598             nan     0.0100   -0.0000
##    500        0.4507             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2341             nan     0.1000    0.0405
##      2        1.1692             nan     0.1000    0.0304
##      3        1.1095             nan     0.1000    0.0311
##      4        1.0607             nan     0.1000    0.0175
##      5        1.0082             nan     0.1000    0.0223
##      6        0.9687             nan     0.1000    0.0167
##      7        0.9350             nan     0.1000    0.0138
##      8        0.9013             nan     0.1000    0.0127
##      9        0.8744             nan     0.1000    0.0105
##     10        0.8507             nan     0.1000    0.0086
##     20        0.7061             nan     0.1000    0.0030
##     40        0.5755             nan     0.1000   -0.0006
##     60        0.4942             nan     0.1000   -0.0003
##     80        0.4354             nan     0.1000    0.0004
##    100        0.3833             nan     0.1000   -0.0005
##    120        0.3425             nan     0.1000   -0.0017
##    140        0.3034             nan     0.1000   -0.0007
##    160        0.2735             nan     0.1000   -0.0001
##    180        0.2482             nan     0.1000   -0.0006
##    200        0.2198             nan     0.1000   -0.0010
##    220        0.2005             nan     0.1000   -0.0002
##    240        0.1834             nan     0.1000   -0.0003
##    260        0.1671             nan     0.1000   -0.0004
##    280        0.1527             nan     0.1000   -0.0004
##    300        0.1407             nan     0.1000   -0.0004
##    320        0.1284             nan     0.1000   -0.0003
##    340        0.1187             nan     0.1000   -0.0003
##    360        0.1092             nan     0.1000   -0.0001
##    380        0.1000             nan     0.1000   -0.0001
##    400        0.0927             nan     0.1000   -0.0002
##    420        0.0849             nan     0.1000   -0.0002
##    440        0.0790             nan     0.1000   -0.0001
##    460        0.0735             nan     0.1000   -0.0001
##    480        0.0682             nan     0.1000   -0.0002
##    500        0.0638             nan     0.1000   -0.0003
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2290             nan     0.1000    0.0389
##      2        1.1719             nan     0.1000    0.0227
##      3        1.1175             nan     0.1000    0.0235
##      4        1.0626             nan     0.1000    0.0244
##      5        1.0158             nan     0.1000    0.0202
##      6        0.9778             nan     0.1000    0.0158
##      7        0.9431             nan     0.1000    0.0131
##      8        0.9037             nan     0.1000    0.0158
##      9        0.8779             nan     0.1000    0.0104
##     10        0.8516             nan     0.1000    0.0097
##     20        0.7113             nan     0.1000   -0.0004
##     40        0.5795             nan     0.1000    0.0010
##     60        0.5005             nan     0.1000   -0.0006
##     80        0.4415             nan     0.1000   -0.0021
##    100        0.3931             nan     0.1000   -0.0020
##    120        0.3505             nan     0.1000   -0.0005
##    140        0.3131             nan     0.1000    0.0003
##    160        0.2790             nan     0.1000   -0.0004
##    180        0.2525             nan     0.1000   -0.0005
##    200        0.2289             nan     0.1000   -0.0007
##    220        0.2082             nan     0.1000   -0.0007
##    240        0.1897             nan     0.1000   -0.0005
##    260        0.1733             nan     0.1000   -0.0002
##    280        0.1582             nan     0.1000   -0.0006
##    300        0.1436             nan     0.1000   -0.0007
##    320        0.1326             nan     0.1000   -0.0004
##    340        0.1219             nan     0.1000   -0.0003
##    360        0.1116             nan     0.1000   -0.0002
##    380        0.1028             nan     0.1000   -0.0000
##    400        0.0958             nan     0.1000   -0.0003
##    420        0.0881             nan     0.1000   -0.0003
##    440        0.0811             nan     0.1000   -0.0004
##    460        0.0753             nan     0.1000   -0.0001
##    480        0.0697             nan     0.1000   -0.0000
##    500        0.0646             nan     0.1000   -0.0003
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2321             nan     0.1000    0.0392
##      2        1.1564             nan     0.1000    0.0367
##      3        1.0986             nan     0.1000    0.0277
##      4        1.0515             nan     0.1000    0.0206
##      5        1.0080             nan     0.1000    0.0194
##      6        0.9677             nan     0.1000    0.0188
##      7        0.9358             nan     0.1000    0.0124
##      8        0.9005             nan     0.1000    0.0142
##      9        0.8730             nan     0.1000    0.0105
##     10        0.8466             nan     0.1000    0.0114
##     20        0.6934             nan     0.1000    0.0023
##     40        0.5712             nan     0.1000   -0.0003
##     60        0.5006             nan     0.1000   -0.0016
##     80        0.4457             nan     0.1000   -0.0019
##    100        0.3930             nan     0.1000   -0.0009
##    120        0.3486             nan     0.1000   -0.0014
##    140        0.3184             nan     0.1000   -0.0011
##    160        0.2861             nan     0.1000   -0.0008
##    180        0.2644             nan     0.1000   -0.0009
##    200        0.2395             nan     0.1000   -0.0009
##    220        0.2199             nan     0.1000   -0.0018
##    240        0.2010             nan     0.1000   -0.0002
##    260        0.1854             nan     0.1000   -0.0009
##    280        0.1703             nan     0.1000   -0.0010
##    300        0.1575             nan     0.1000   -0.0006
##    320        0.1449             nan     0.1000   -0.0009
##    340        0.1314             nan     0.1000   -0.0002
##    360        0.1212             nan     0.1000   -0.0003
##    380        0.1121             nan     0.1000   -0.0007
##    400        0.1043             nan     0.1000   -0.0006
##    420        0.0970             nan     0.1000   -0.0001
##    440        0.0899             nan     0.1000   -0.0002
##    460        0.0830             nan     0.1000   -0.0004
##    480        0.0774             nan     0.1000   -0.0003
##    500        0.0710             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2290             nan     0.1000    0.0393
##      2        1.1512             nan     0.1000    0.0391
##      3        1.0878             nan     0.1000    0.0272
##      4        1.0339             nan     0.1000    0.0247
##      5        0.9880             nan     0.1000    0.0185
##      6        0.9479             nan     0.1000    0.0148
##      7        0.9043             nan     0.1000    0.0184
##      8        0.8661             nan     0.1000    0.0149
##      9        0.8369             nan     0.1000    0.0110
##     10        0.8136             nan     0.1000    0.0088
##     20        0.6606             nan     0.1000    0.0040
##     40        0.5210             nan     0.1000   -0.0003
##     60        0.4408             nan     0.1000   -0.0002
##     80        0.3806             nan     0.1000   -0.0004
##    100        0.3337             nan     0.1000   -0.0017
##    120        0.2903             nan     0.1000   -0.0002
##    140        0.2539             nan     0.1000   -0.0007
##    160        0.2208             nan     0.1000   -0.0008
##    180        0.1963             nan     0.1000   -0.0004
##    200        0.1752             nan     0.1000   -0.0010
##    220        0.1555             nan     0.1000   -0.0006
##    240        0.1380             nan     0.1000   -0.0002
##    260        0.1238             nan     0.1000   -0.0001
##    280        0.1087             nan     0.1000   -0.0003
##    300        0.0979             nan     0.1000   -0.0005
##    320        0.0887             nan     0.1000   -0.0003
##    340        0.0800             nan     0.1000   -0.0002
##    360        0.0717             nan     0.1000   -0.0000
##    380        0.0642             nan     0.1000   -0.0001
##    400        0.0577             nan     0.1000    0.0000
##    420        0.0526             nan     0.1000   -0.0001
##    440        0.0472             nan     0.1000   -0.0001
##    460        0.0432             nan     0.1000   -0.0003
##    480        0.0392             nan     0.1000   -0.0001
##    500        0.0360             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2282             nan     0.1000    0.0431
##      2        1.1537             nan     0.1000    0.0291
##      3        1.0933             nan     0.1000    0.0303
##      4        1.0353             nan     0.1000    0.0240
##      5        0.9882             nan     0.1000    0.0202
##      6        0.9473             nan     0.1000    0.0159
##      7        0.9139             nan     0.1000    0.0142
##      8        0.8803             nan     0.1000    0.0154
##      9        0.8508             nan     0.1000    0.0108
##     10        0.8249             nan     0.1000    0.0092
##     20        0.6708             nan     0.1000    0.0035
##     40        0.5247             nan     0.1000   -0.0006
##     60        0.4458             nan     0.1000   -0.0003
## [verbose gbm boosting trace truncated: repeated per-resample logs of
##  Iter, TrainDeviance, ValidDeviance (nan; no internal validation split),
##  StepSize (shrinkage = 0.1000, 0.0010, 0.0100), and Improve
##  over 500 boosting iterations for each resample and tuning setting]
##    360        0.6250             nan     0.0100   -0.0000
##    380        0.6136             nan     0.0100    0.0000
##    400        0.6038             nan     0.0100    0.0001
##    420        0.5955             nan     0.0100   -0.0001
##    440        0.5865             nan     0.0100    0.0002
##    460        0.5781             nan     0.0100   -0.0000
##    480        0.5701             nan     0.0100   -0.0001
##    500        0.5626             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3123             nan     0.0100    0.0044
##      2        1.3038             nan     0.0100    0.0037
##      3        1.2946             nan     0.0100    0.0042
##      4        1.2865             nan     0.0100    0.0037
##      5        1.2781             nan     0.0100    0.0036
##      6        1.2700             nan     0.0100    0.0034
##      7        1.2623             nan     0.0100    0.0030
##      8        1.2534             nan     0.0100    0.0037
##      9        1.2463             nan     0.0100    0.0032
##     10        1.2388             nan     0.0100    0.0031
##     20        1.1702             nan     0.0100    0.0026
##     40        1.0608             nan     0.0100    0.0022
##     60        0.9742             nan     0.0100    0.0015
##     80        0.9090             nan     0.0100    0.0012
##    100        0.8571             nan     0.0100    0.0008
##    120        0.8145             nan     0.0100    0.0006
##    140        0.7755             nan     0.0100    0.0005
##    160        0.7447             nan     0.0100    0.0005
##    180        0.7187             nan     0.0100    0.0004
##    200        0.6945             nan     0.0100    0.0002
##    220        0.6724             nan     0.0100    0.0002
##    240        0.6533             nan     0.0100    0.0001
##    260        0.6344             nan     0.0100    0.0002
##    280        0.6177             nan     0.0100    0.0000
##    300        0.6027             nan     0.0100    0.0000
##    320        0.5893             nan     0.0100    0.0000
##    340        0.5765             nan     0.0100    0.0001
##    360        0.5637             nan     0.0100    0.0002
##    380        0.5518             nan     0.0100    0.0001
##    400        0.5408             nan     0.0100   -0.0000
##    420        0.5296             nan     0.0100   -0.0000
##    440        0.5194             nan     0.0100    0.0001
##    460        0.5090             nan     0.0100   -0.0000
##    480        0.5000             nan     0.0100   -0.0001
##    500        0.4905             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3123             nan     0.0100    0.0038
##      2        1.3040             nan     0.0100    0.0034
##      3        1.2956             nan     0.0100    0.0038
##      4        1.2864             nan     0.0100    0.0044
##      5        1.2790             nan     0.0100    0.0034
##      6        1.2714             nan     0.0100    0.0035
##      7        1.2627             nan     0.0100    0.0040
##      8        1.2554             nan     0.0100    0.0031
##      9        1.2482             nan     0.0100    0.0029
##     10        1.2405             nan     0.0100    0.0035
##     20        1.1708             nan     0.0100    0.0032
##     40        1.0604             nan     0.0100    0.0023
##     60        0.9772             nan     0.0100    0.0018
##     80        0.9113             nan     0.0100    0.0012
##    100        0.8581             nan     0.0100    0.0009
##    120        0.8139             nan     0.0100    0.0008
##    140        0.7782             nan     0.0100    0.0005
##    160        0.7469             nan     0.0100    0.0004
##    180        0.7205             nan     0.0100    0.0003
##    200        0.6972             nan     0.0100    0.0002
##    220        0.6759             nan     0.0100    0.0001
##    240        0.6570             nan     0.0100    0.0004
##    260        0.6402             nan     0.0100   -0.0000
##    280        0.6250             nan     0.0100    0.0000
##    300        0.6109             nan     0.0100   -0.0001
##    320        0.5972             nan     0.0100    0.0002
##    340        0.5838             nan     0.0100    0.0004
##    360        0.5721             nan     0.0100    0.0003
##    380        0.5600             nan     0.0100   -0.0001
##    400        0.5491             nan     0.0100    0.0001
##    420        0.5393             nan     0.0100    0.0001
##    440        0.5289             nan     0.0100   -0.0001
##    460        0.5192             nan     0.0100    0.0000
##    480        0.5095             nan     0.0100   -0.0000
##    500        0.5005             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3120             nan     0.0100    0.0038
##      2        1.3038             nan     0.0100    0.0035
##      3        1.2954             nan     0.0100    0.0033
##      4        1.2876             nan     0.0100    0.0033
##      5        1.2793             nan     0.0100    0.0035
##      6        1.2722             nan     0.0100    0.0033
##      7        1.2645             nan     0.0100    0.0035
##      8        1.2557             nan     0.0100    0.0041
##      9        1.2471             nan     0.0100    0.0039
##     10        1.2390             nan     0.0100    0.0033
##     20        1.1718             nan     0.0100    0.0029
##     40        1.0603             nan     0.0100    0.0021
##     60        0.9801             nan     0.0100    0.0016
##     80        0.9140             nan     0.0100    0.0010
##    100        0.8608             nan     0.0100    0.0008
##    120        0.8182             nan     0.0100    0.0007
##    140        0.7837             nan     0.0100    0.0005
##    160        0.7539             nan     0.0100    0.0002
##    180        0.7276             nan     0.0100    0.0003
##    200        0.7034             nan     0.0100    0.0003
##    220        0.6825             nan     0.0100    0.0002
##    240        0.6633             nan     0.0100    0.0001
##    260        0.6476             nan     0.0100    0.0001
##    280        0.6318             nan     0.0100    0.0001
##    300        0.6173             nan     0.0100   -0.0002
##    320        0.6036             nan     0.0100    0.0001
##    340        0.5909             nan     0.0100   -0.0001
##    360        0.5791             nan     0.0100   -0.0000
##    380        0.5680             nan     0.0100    0.0001
##    400        0.5577             nan     0.0100   -0.0000
##    420        0.5475             nan     0.0100   -0.0001
##    440        0.5371             nan     0.0100    0.0000
##    460        0.5272             nan     0.0100   -0.0001
##    480        0.5173             nan     0.0100   -0.0001
##    500        0.5072             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3120             nan     0.0100    0.0042
##      2        1.3030             nan     0.0100    0.0039
##      3        1.2947             nan     0.0100    0.0041
##      4        1.2866             nan     0.0100    0.0033
##      5        1.2771             nan     0.0100    0.0038
##      6        1.2688             nan     0.0100    0.0037
##      7        1.2607             nan     0.0100    0.0036
##      8        1.2528             nan     0.0100    0.0034
##      9        1.2443             nan     0.0100    0.0042
##     10        1.2359             nan     0.0100    0.0036
##     20        1.1631             nan     0.0100    0.0033
##     40        1.0477             nan     0.0100    0.0019
##     60        0.9605             nan     0.0100    0.0016
##     80        0.8927             nan     0.0100    0.0009
##    100        0.8355             nan     0.0100    0.0011
##    120        0.7902             nan     0.0100    0.0005
##    140        0.7516             nan     0.0100    0.0006
##    160        0.7171             nan     0.0100    0.0005
##    180        0.6882             nan     0.0100    0.0003
##    200        0.6625             nan     0.0100    0.0003
##    220        0.6398             nan     0.0100    0.0003
##    240        0.6190             nan     0.0100    0.0000
##    260        0.5993             nan     0.0100    0.0001
##    280        0.5817             nan     0.0100   -0.0001
##    300        0.5650             nan     0.0100    0.0001
##    320        0.5502             nan     0.0100    0.0001
##    340        0.5363             nan     0.0100   -0.0000
##    360        0.5233             nan     0.0100    0.0000
##    380        0.5096             nan     0.0100   -0.0001
##    400        0.4965             nan     0.0100    0.0000
##    420        0.4852             nan     0.0100   -0.0001
##    440        0.4741             nan     0.0100    0.0000
##    460        0.4627             nan     0.0100    0.0000
##    480        0.4523             nan     0.0100   -0.0000
##    500        0.4419             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3111             nan     0.0100    0.0048
##      2        1.3029             nan     0.0100    0.0037
##      3        1.2936             nan     0.0100    0.0040
##      4        1.2846             nan     0.0100    0.0038
##      5        1.2754             nan     0.0100    0.0037
##      6        1.2664             nan     0.0100    0.0043
##      7        1.2584             nan     0.0100    0.0036
##      8        1.2508             nan     0.0100    0.0034
##      9        1.2418             nan     0.0100    0.0040
##     10        1.2339             nan     0.0100    0.0032
##     20        1.1629             nan     0.0100    0.0030
##     40        1.0495             nan     0.0100    0.0022
##     60        0.9608             nan     0.0100    0.0018
##     80        0.8916             nan     0.0100    0.0012
##    100        0.8383             nan     0.0100    0.0009
##    120        0.7937             nan     0.0100    0.0007
##    140        0.7566             nan     0.0100    0.0006
##    160        0.7232             nan     0.0100    0.0004
##    180        0.6944             nan     0.0100    0.0004
##    200        0.6705             nan     0.0100    0.0001
##    220        0.6494             nan     0.0100    0.0000
##    240        0.6283             nan     0.0100    0.0003
##    260        0.6087             nan     0.0100   -0.0000
##    280        0.5914             nan     0.0100   -0.0000
##    300        0.5756             nan     0.0100    0.0000
##    320        0.5611             nan     0.0100   -0.0001
##    340        0.5464             nan     0.0100   -0.0000
##    360        0.5338             nan     0.0100   -0.0000
##    380        0.5207             nan     0.0100   -0.0000
##    400        0.5079             nan     0.0100   -0.0002
##    420        0.4964             nan     0.0100    0.0000
##    440        0.4847             nan     0.0100    0.0001
##    460        0.4738             nan     0.0100   -0.0001
##    480        0.4637             nan     0.0100    0.0001
##    500        0.4532             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3128             nan     0.0100    0.0037
##      2        1.3045             nan     0.0100    0.0036
##      3        1.2961             nan     0.0100    0.0034
##      4        1.2880             nan     0.0100    0.0037
##      5        1.2799             nan     0.0100    0.0037
##      6        1.2710             nan     0.0100    0.0039
##      7        1.2627             nan     0.0100    0.0034
##      8        1.2544             nan     0.0100    0.0039
##      9        1.2466             nan     0.0100    0.0034
##     10        1.2388             nan     0.0100    0.0038
##     20        1.1685             nan     0.0100    0.0026
##     40        1.0516             nan     0.0100    0.0024
##     60        0.9663             nan     0.0100    0.0017
##     80        0.8980             nan     0.0100    0.0010
##    100        0.8418             nan     0.0100    0.0008
##    120        0.7976             nan     0.0100    0.0008
##    140        0.7607             nan     0.0100    0.0006
##    160        0.7300             nan     0.0100    0.0003
##    180        0.7017             nan     0.0100    0.0005
##    200        0.6774             nan     0.0100    0.0001
##    220        0.6575             nan     0.0100    0.0001
##    240        0.6381             nan     0.0100    0.0002
##    260        0.6189             nan     0.0100    0.0003
##    280        0.6012             nan     0.0100    0.0000
##    300        0.5855             nan     0.0100    0.0005
##    320        0.5707             nan     0.0100    0.0002
##    340        0.5562             nan     0.0100    0.0001
##    360        0.5427             nan     0.0100   -0.0000
##    380        0.5296             nan     0.0100    0.0001
##    400        0.5170             nan     0.0100    0.0001
##    420        0.5059             nan     0.0100    0.0000
##    440        0.4948             nan     0.0100   -0.0002
##    460        0.4836             nan     0.0100    0.0001
##    480        0.4722             nan     0.0100   -0.0000
##    500        0.4625             nan     0.0100    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2391             nan     0.1000    0.0401
##      2        1.1674             nan     0.1000    0.0311
##      3        1.1199             nan     0.1000    0.0173
##      4        1.0669             nan     0.1000    0.0223
##      5        1.0269             nan     0.1000    0.0177
##      6        0.9941             nan     0.1000    0.0127
##      7        0.9601             nan     0.1000    0.0157
##      8        0.9312             nan     0.1000    0.0117
##      9        0.9063             nan     0.1000    0.0085
##     10        0.8880             nan     0.1000    0.0071
##     20        0.7333             nan     0.1000    0.0049
##     40        0.5907             nan     0.1000   -0.0005
##     60        0.5052             nan     0.1000   -0.0018
##     80        0.4351             nan     0.1000   -0.0003
##    100        0.3833             nan     0.1000   -0.0009
##    120        0.3429             nan     0.1000   -0.0010
##    140        0.3058             nan     0.1000    0.0004
##    160        0.2714             nan     0.1000   -0.0002
##    180        0.2432             nan     0.1000   -0.0002
##    200        0.2183             nan     0.1000   -0.0004
##    220        0.1952             nan     0.1000   -0.0000
##    240        0.1751             nan     0.1000   -0.0005
##    260        0.1574             nan     0.1000    0.0000
##    280        0.1413             nan     0.1000   -0.0002
##    300        0.1276             nan     0.1000   -0.0006
##    320        0.1161             nan     0.1000   -0.0002
##    340        0.1066             nan     0.1000   -0.0001
##    360        0.0988             nan     0.1000    0.0001
##    380        0.0902             nan     0.1000   -0.0001
##    400        0.0819             nan     0.1000   -0.0002
##    420        0.0752             nan     0.1000   -0.0001
##    440        0.0690             nan     0.1000   -0.0003
##    460        0.0632             nan     0.1000    0.0001
##    480        0.0582             nan     0.1000   -0.0003
##    500        0.0533             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2442             nan     0.1000    0.0357
##      2        1.1832             nan     0.1000    0.0264
##      3        1.1289             nan     0.1000    0.0244
##      4        1.0775             nan     0.1000    0.0240
##      5        1.0320             nan     0.1000    0.0196
##      6        0.9941             nan     0.1000    0.0171
##      7        0.9614             nan     0.1000    0.0127
##      8        0.9324             nan     0.1000    0.0104
##      9        0.9084             nan     0.1000    0.0091
##     10        0.8821             nan     0.1000    0.0080
##     20        0.7443             nan     0.1000    0.0013
##     40        0.6162             nan     0.1000   -0.0018
##     60        0.5226             nan     0.1000   -0.0034
##     80        0.4514             nan     0.1000    0.0002
##    100        0.3994             nan     0.1000   -0.0014
##    120        0.3545             nan     0.1000   -0.0001
##    140        0.3176             nan     0.1000   -0.0009
##    160        0.2854             nan     0.1000   -0.0002
##    180        0.2552             nan     0.1000    0.0000
##    200        0.2288             nan     0.1000   -0.0003
##    220        0.2069             nan     0.1000   -0.0009
##    240        0.1861             nan     0.1000   -0.0004
##    260        0.1709             nan     0.1000   -0.0012
##    280        0.1572             nan     0.1000   -0.0006
##    300        0.1414             nan     0.1000   -0.0002
##    320        0.1298             nan     0.1000   -0.0003
##    340        0.1190             nan     0.1000   -0.0004
##    360        0.1097             nan     0.1000   -0.0003
##    380        0.1011             nan     0.1000   -0.0002
##    400        0.0934             nan     0.1000   -0.0004
##    420        0.0857             nan     0.1000   -0.0002
##    440        0.0799             nan     0.1000   -0.0002
##    460        0.0730             nan     0.1000    0.0001
##    480        0.0673             nan     0.1000   -0.0003
##    500        0.0622             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2449             nan     0.1000    0.0316
##      2        1.1806             nan     0.1000    0.0264
##      3        1.1271             nan     0.1000    0.0242
##      4        1.0771             nan     0.1000    0.0229
##      5        1.0351             nan     0.1000    0.0175
##      6        0.9959             nan     0.1000    0.0179
##      7        0.9596             nan     0.1000    0.0170
##      8        0.9322             nan     0.1000    0.0101
##      9        0.9097             nan     0.1000    0.0081
##     10        0.8848             nan     0.1000    0.0088
##     20        0.7486             nan     0.1000    0.0022
##     40        0.6187             nan     0.1000   -0.0015
##     60        0.5322             nan     0.1000   -0.0008
##     80        0.4675             nan     0.1000    0.0012
##    100        0.4131             nan     0.1000   -0.0002
##    120        0.3656             nan     0.1000   -0.0003
##    140        0.3288             nan     0.1000   -0.0008
##    160        0.2927             nan     0.1000   -0.0008
##    180        0.2652             nan     0.1000   -0.0009
##    200        0.2418             nan     0.1000   -0.0003
##    220        0.2200             nan     0.1000   -0.0005
##    240        0.2016             nan     0.1000   -0.0008
##    260        0.1860             nan     0.1000   -0.0013
##    280        0.1680             nan     0.1000   -0.0001
##    300        0.1547             nan     0.1000   -0.0006
##    320        0.1416             nan     0.1000   -0.0004
##    340        0.1306             nan     0.1000   -0.0003
##    360        0.1208             nan     0.1000    0.0003
##    380        0.1118             nan     0.1000   -0.0003
##    400        0.1030             nan     0.1000   -0.0003
##    420        0.0953             nan     0.1000   -0.0002
##    440        0.0872             nan     0.1000    0.0001
##    460        0.0798             nan     0.1000   -0.0002
##    480        0.0742             nan     0.1000   -0.0004
##    500        0.0688             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2284             nan     0.1000    0.0418
##      2        1.1607             nan     0.1000    0.0262
##      3        1.1056             nan     0.1000    0.0179
##      4        1.0527             nan     0.1000    0.0234
##      5        1.0120             nan     0.1000    0.0155
##      6        0.9715             nan     0.1000    0.0171
##      7        0.9378             nan     0.1000    0.0132
##      8        0.9013             nan     0.1000    0.0151
##      9        0.8768             nan     0.1000    0.0063
##     10        0.8572             nan     0.1000    0.0070
##     20        0.6932             nan     0.1000    0.0031
##     40        0.5408             nan     0.1000    0.0002
##     60        0.4604             nan     0.1000   -0.0014
##     80        0.3817             nan     0.1000   -0.0015
##    100        0.3252             nan     0.1000    0.0001
##    120        0.2802             nan     0.1000   -0.0008
##    140        0.2428             nan     0.1000   -0.0004
##    160        0.2098             nan     0.1000   -0.0006
##    180        0.1821             nan     0.1000    0.0002
##    200        0.1603             nan     0.1000   -0.0002
##    220        0.1396             nan     0.1000    0.0001
##    240        0.1226             nan     0.1000   -0.0005
##    260        0.1083             nan     0.1000   -0.0003
##    280        0.0966             nan     0.1000   -0.0003
##    300        0.0873             nan     0.1000   -0.0002
##    320        0.0789             nan     0.1000   -0.0002
##    340        0.0701             nan     0.1000    0.0001
##    360        0.0624             nan     0.1000   -0.0000
##    380        0.0561             nan     0.1000   -0.0001
##    400        0.0507             nan     0.1000   -0.0001
##    420        0.0452             nan     0.1000   -0.0000
##    440        0.0410             nan     0.1000   -0.0001
##    460        0.0367             nan     0.1000   -0.0001
##    480        0.0328             nan     0.1000   -0.0001
##    500        0.0298             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2302             nan     0.1000    0.0435
##      2        1.1650             nan     0.1000    0.0295
##      3        1.1131             nan     0.1000    0.0217
##      4        1.0603             nan     0.1000    0.0229
##      5        1.0176             nan     0.1000    0.0204
##      6        0.9806             nan     0.1000    0.0145
##      7        0.9426             nan     0.1000    0.0168
##      8        0.9123             nan     0.1000    0.0138
##      9        0.8847             nan     0.1000    0.0108
##     10        0.8552             nan     0.1000    0.0106
##     20        0.7055             nan     0.1000    0.0001
##     40        0.5540             nan     0.1000    0.0029
##     60        0.4641             nan     0.1000   -0.0013
##     80        0.3881             nan     0.1000   -0.0009
##    100        0.3293             nan     0.1000   -0.0007
##    120        0.2851             nan     0.1000   -0.0007
##    140        0.2461             nan     0.1000   -0.0011
##    160        0.2174             nan     0.1000   -0.0001
##    180        0.1863             nan     0.1000   -0.0008
##    200        0.1639             nan     0.1000   -0.0002
##    220        0.1451             nan     0.1000   -0.0003
##    240        0.1284             nan     0.1000   -0.0002
##    260        0.1148             nan     0.1000   -0.0002
##    280        0.1008             nan     0.1000   -0.0004
##    300        0.0905             nan     0.1000   -0.0001
##    320        0.0793             nan     0.1000   -0.0001
##    340        0.0708             nan     0.1000   -0.0002
##    360        0.0632             nan     0.1000   -0.0003
##    380        0.0559             nan     0.1000   -0.0001
##    400        0.0506             nan     0.1000   -0.0002
##    420        0.0461             nan     0.1000   -0.0002
##    440        0.0414             nan     0.1000   -0.0001
##    460        0.0374             nan     0.1000   -0.0001
##    480        0.0336             nan     0.1000   -0.0000
##    500        0.0303             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2437             nan     0.1000    0.0360
##      2        1.1636             nan     0.1000    0.0357
##      3        1.1068             nan     0.1000    0.0253
##      4        1.0531             nan     0.1000    0.0252
##      5        1.0096             nan     0.1000    0.0190
##      6        0.9771             nan     0.1000    0.0142
##      7        0.9412             nan     0.1000    0.0121
##      8        0.9112             nan     0.1000    0.0128
##      9        0.8846             nan     0.1000    0.0091
##     10        0.8576             nan     0.1000    0.0107
##     20        0.7034             nan     0.1000    0.0048
##     40        0.5497             nan     0.1000    0.0007
##     60        0.4685             nan     0.1000   -0.0003
##     80        0.4007             nan     0.1000   -0.0011
##    100        0.3477             nan     0.1000   -0.0013
##    120        0.3014             nan     0.1000   -0.0003
##    140        0.2625             nan     0.1000   -0.0006
##    160        0.2328             nan     0.1000   -0.0005
##    180        0.2099             nan     0.1000   -0.0006
##    200        0.1850             nan     0.1000   -0.0009
##    220        0.1627             nan     0.1000   -0.0005
##    240        0.1452             nan     0.1000   -0.0002
##    260        0.1309             nan     0.1000   -0.0004
##    280        0.1170             nan     0.1000   -0.0004
##    300        0.1048             nan     0.1000   -0.0006
##    320        0.0939             nan     0.1000   -0.0004
##    340        0.0850             nan     0.1000   -0.0002
##    360        0.0764             nan     0.1000   -0.0002
##    380        0.0679             nan     0.1000   -0.0001
##    400        0.0606             nan     0.1000   -0.0003
##    420        0.0550             nan     0.1000   -0.0001
##    440        0.0500             nan     0.1000   -0.0002
##    460        0.0453             nan     0.1000   -0.0000
##    480        0.0405             nan     0.1000   -0.0002
##    500        0.0364             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2369             nan     0.1000    0.0361
##      2        1.1540             nan     0.1000    0.0367
##      3        1.0851             nan     0.1000    0.0297
##      4        1.0349             nan     0.1000    0.0189
##      5        0.9886             nan     0.1000    0.0174
##      6        0.9506             nan     0.1000    0.0141
##      7        0.9125             nan     0.1000    0.0150
##      8        0.8775             nan     0.1000    0.0145
##      9        0.8477             nan     0.1000    0.0103
##     10        0.8224             nan     0.1000    0.0079
##     20        0.6541             nan     0.1000   -0.0011
##     40        0.4871             nan     0.1000   -0.0006
##     60        0.3957             nan     0.1000   -0.0014
##     80        0.3381             nan     0.1000   -0.0013
##    100        0.2759             nan     0.1000   -0.0001
##    120        0.2369             nan     0.1000   -0.0009
##    140        0.2025             nan     0.1000   -0.0003
##    160        0.1712             nan     0.1000   -0.0001
##    180        0.1478             nan     0.1000   -0.0004
##    200        0.1285             nan     0.1000   -0.0004
##    220        0.1099             nan     0.1000   -0.0002
##    240        0.0949             nan     0.1000   -0.0003
##    260        0.0823             nan     0.1000   -0.0004
##    280        0.0718             nan     0.1000   -0.0001
##    300        0.0634             nan     0.1000   -0.0002
##    320        0.0549             nan     0.1000   -0.0003
##    340        0.0485             nan     0.1000   -0.0000
##    360        0.0421             nan     0.1000   -0.0001
##    380        0.0374             nan     0.1000   -0.0001
##    400        0.0332             nan     0.1000   -0.0000
##    420        0.0294             nan     0.1000   -0.0000
##    440        0.0259             nan     0.1000   -0.0001
##    460        0.0229             nan     0.1000   -0.0001
##    480        0.0202             nan     0.1000   -0.0000
##    500        0.0180             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2306             nan     0.1000    0.0378
##      2        1.1544             nan     0.1000    0.0309
##      3        1.0967             nan     0.1000    0.0214
##      4        1.0418             nan     0.1000    0.0235
##      5        0.9973             nan     0.1000    0.0172
##      6        0.9544             nan     0.1000    0.0162
##      7        0.9188             nan     0.1000    0.0141
##      8        0.8855             nan     0.1000    0.0121
##      9        0.8542             nan     0.1000    0.0111
##     10        0.8302             nan     0.1000    0.0092
##     20        0.6667             nan     0.1000    0.0027
##     40        0.5060             nan     0.1000   -0.0001
##     60        0.4009             nan     0.1000    0.0011
##     80        0.3305             nan     0.1000   -0.0001
##    100        0.2756             nan     0.1000   -0.0001
##    120        0.2333             nan     0.1000   -0.0005
##    140        0.1999             nan     0.1000   -0.0005
##    160        0.1697             nan     0.1000   -0.0001
##    180        0.1449             nan     0.1000   -0.0005
##    200        0.1228             nan     0.1000   -0.0003
##    220        0.1062             nan     0.1000   -0.0003
##    240        0.0917             nan     0.1000   -0.0002
##    260        0.0802             nan     0.1000   -0.0004
##    280        0.0704             nan     0.1000   -0.0003
##    300        0.0621             nan     0.1000   -0.0002
##    320        0.0538             nan     0.1000   -0.0002
##    340        0.0471             nan     0.1000   -0.0000
##    360        0.0413             nan     0.1000   -0.0001
##    380        0.0361             nan     0.1000   -0.0002
##    400        0.0317             nan     0.1000   -0.0000
##    420        0.0281             nan     0.1000   -0.0001
##    440        0.0245             nan     0.1000   -0.0000
##    460        0.0217             nan     0.1000   -0.0000
##    480        0.0195             nan     0.1000   -0.0001
##    500        0.0173             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2359             nan     0.1000    0.0403
##      2        1.1658             nan     0.1000    0.0345
##      3        1.1071             nan     0.1000    0.0246
##      4        1.0520             nan     0.1000    0.0245
##      5        1.0014             nan     0.1000    0.0232
##      6        0.9603             nan     0.1000    0.0155
##      7        0.9261             nan     0.1000    0.0153
##      8        0.8930             nan     0.1000    0.0120
##      9        0.8614             nan     0.1000    0.0103
##     10        0.8401             nan     0.1000    0.0065
##     20        0.6902             nan     0.1000    0.0011
##     40        0.5086             nan     0.1000    0.0016
##     60        0.4168             nan     0.1000   -0.0002
##     80        0.3509             nan     0.1000   -0.0004
##    100        0.2943             nan     0.1000   -0.0005
##    120        0.2485             nan     0.1000    0.0003
##    140        0.2117             nan     0.1000   -0.0020
##    160        0.1805             nan     0.1000   -0.0006
##    180        0.1541             nan     0.1000   -0.0006
##    200        0.1340             nan     0.1000   -0.0007
##    220        0.1158             nan     0.1000   -0.0002
##    240        0.1021             nan     0.1000   -0.0003
##    260        0.0898             nan     0.1000   -0.0002
##    280        0.0790             nan     0.1000   -0.0007
##    300        0.0686             nan     0.1000   -0.0003
##    320        0.0608             nan     0.1000   -0.0000
##    340        0.0539             nan     0.1000   -0.0004
##    360        0.0472             nan     0.1000   -0.0003
##    380        0.0417             nan     0.1000   -0.0002
##    400        0.0368             nan     0.1000   -0.0001
##    420        0.0326             nan     0.1000   -0.0001
##    440        0.0288             nan     0.1000   -0.0001
##    460        0.0253             nan     0.1000   -0.0001
##    480        0.0221             nan     0.1000   -0.0000
##    500        0.0200             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3190             nan     0.0010    0.0004
##      3        1.3181             nan     0.0010    0.0004
##      4        1.3172             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0004
##      6        1.3154             nan     0.0010    0.0004
##      7        1.3145             nan     0.0010    0.0004
##      8        1.3137             nan     0.0010    0.0004
##      9        1.3128             nan     0.0010    0.0004
##     10        1.3120             nan     0.0010    0.0003
##     20        1.3035             nan     0.0010    0.0004
##     40        1.2869             nan     0.0010    0.0004
##     60        1.2709             nan     0.0010    0.0004
##     80        1.2550             nan     0.0010    0.0004
##    100        1.2403             nan     0.0010    0.0003
##    120        1.2257             nan     0.0010    0.0003
##    140        1.2118             nan     0.0010    0.0003
##    160        1.1979             nan     0.0010    0.0003
##    180        1.1846             nan     0.0010    0.0003
##    200        1.1713             nan     0.0010    0.0003
##    220        1.1588             nan     0.0010    0.0002
##    240        1.1466             nan     0.0010    0.0003
##    260        1.1347             nan     0.0010    0.0003
##    280        1.1232             nan     0.0010    0.0003
##    300        1.1118             nan     0.0010    0.0002
##    320        1.1007             nan     0.0010    0.0002
##    340        1.0900             nan     0.0010    0.0002
##    360        1.0800             nan     0.0010    0.0002
##    380        1.0699             nan     0.0010    0.0002
##    400        1.0603             nan     0.0010    0.0002
##    420        1.0507             nan     0.0010    0.0002
##    440        1.0415             nan     0.0010    0.0002
##    460        1.0326             nan     0.0010    0.0002
##    480        1.0239             nan     0.0010    0.0002
##    500        1.0154             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3189             nan     0.0010    0.0004
##      3        1.3180             nan     0.0010    0.0004
##      4        1.3172             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0004
##      6        1.3154             nan     0.0010    0.0004
##      7        1.3146             nan     0.0010    0.0004
##      8        1.3138             nan     0.0010    0.0004
##      9        1.3129             nan     0.0010    0.0004
##     10        1.3121             nan     0.0010    0.0004
##     20        1.3037             nan     0.0010    0.0004
##     40        1.2875             nan     0.0010    0.0004
##     60        1.2713             nan     0.0010    0.0003
##     80        1.2556             nan     0.0010    0.0004
##    100        1.2406             nan     0.0010    0.0003
##    120        1.2262             nan     0.0010    0.0003
##    140        1.2121             nan     0.0010    0.0003
##    160        1.1984             nan     0.0010    0.0003
##    180        1.1852             nan     0.0010    0.0003
##    200        1.1722             nan     0.0010    0.0003
##    220        1.1594             nan     0.0010    0.0003
##    240        1.1472             nan     0.0010    0.0003
##    260        1.1355             nan     0.0010    0.0003
##    280        1.1241             nan     0.0010    0.0002
##    300        1.1128             nan     0.0010    0.0003
##    320        1.1020             nan     0.0010    0.0002
##    340        1.0915             nan     0.0010    0.0003
##    360        1.0812             nan     0.0010    0.0002
##    380        1.0713             nan     0.0010    0.0002
##    400        1.0613             nan     0.0010    0.0002
##    420        1.0518             nan     0.0010    0.0002
##    440        1.0425             nan     0.0010    0.0002
##    460        1.0335             nan     0.0010    0.0002
##    480        1.0249             nan     0.0010    0.0002
##    500        1.0164             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3190             nan     0.0010    0.0004
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3164             nan     0.0010    0.0004
##      6        1.3156             nan     0.0010    0.0004
##      7        1.3148             nan     0.0010    0.0004
##      8        1.3139             nan     0.0010    0.0004
##      9        1.3131             nan     0.0010    0.0004
##     10        1.3123             nan     0.0010    0.0003
##     20        1.3038             nan     0.0010    0.0003
##     40        1.2872             nan     0.0010    0.0004
##     60        1.2714             nan     0.0010    0.0003
##     80        1.2560             nan     0.0010    0.0003
##    100        1.2414             nan     0.0010    0.0004
##    120        1.2269             nan     0.0010    0.0003
##    140        1.2131             nan     0.0010    0.0003
##    160        1.1995             nan     0.0010    0.0003
##    180        1.1860             nan     0.0010    0.0003
##    200        1.1732             nan     0.0010    0.0003
##    220        1.1606             nan     0.0010    0.0003
##    240        1.1483             nan     0.0010    0.0003
##    260        1.1364             nan     0.0010    0.0003
##    280        1.1250             nan     0.0010    0.0003
##    300        1.1138             nan     0.0010    0.0002
##    320        1.1030             nan     0.0010    0.0002
##    340        1.0926             nan     0.0010    0.0002
##    360        1.0825             nan     0.0010    0.0002
##    380        1.0726             nan     0.0010    0.0002
##    400        1.0628             nan     0.0010    0.0002
##    420        1.0533             nan     0.0010    0.0002
##    440        1.0442             nan     0.0010    0.0002
##    460        1.0351             nan     0.0010    0.0002
##    480        1.0264             nan     0.0010    0.0002
##    500        1.0178             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3196             nan     0.0010    0.0004
##      2        1.3187             nan     0.0010    0.0004
##      3        1.3179             nan     0.0010    0.0004
##      4        1.3169             nan     0.0010    0.0005
##      5        1.3160             nan     0.0010    0.0004
##      6        1.3151             nan     0.0010    0.0004
##      7        1.3142             nan     0.0010    0.0004
##      8        1.3133             nan     0.0010    0.0004
##      9        1.3124             nan     0.0010    0.0004
##     10        1.3115             nan     0.0010    0.0004
##     20        1.3025             nan     0.0010    0.0004
##     40        1.2847             nan     0.0010    0.0004
##     60        1.2674             nan     0.0010    0.0003
##     80        1.2510             nan     0.0010    0.0003
##    100        1.2347             nan     0.0010    0.0003
##    120        1.2189             nan     0.0010    0.0004
##    140        1.2037             nan     0.0010    0.0003
##    160        1.1891             nan     0.0010    0.0003
##    180        1.1749             nan     0.0010    0.0003
##    200        1.1609             nan     0.0010    0.0003
##    220        1.1477             nan     0.0010    0.0003
##    240        1.1348             nan     0.0010    0.0003
##    260        1.1222             nan     0.0010    0.0003
##    280        1.1101             nan     0.0010    0.0003
##    300        1.0982             nan     0.0010    0.0002
##    320        1.0868             nan     0.0010    0.0002
##    340        1.0755             nan     0.0010    0.0003
##    360        1.0643             nan     0.0010    0.0002
##    380        1.0536             nan     0.0010    0.0002
##    400        1.0432             nan     0.0010    0.0002
##    420        1.0334             nan     0.0010    0.0002
##    440        1.0237             nan     0.0010    0.0002
##    460        1.0140             nan     0.0010    0.0002
##    480        1.0049             nan     0.0010    0.0002
##    500        0.9957             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3187             nan     0.0010    0.0004
##      3        1.3178             nan     0.0010    0.0004
##      4        1.3169             nan     0.0010    0.0004
##      5        1.3159             nan     0.0010    0.0004
##      6        1.3150             nan     0.0010    0.0004
##      7        1.3141             nan     0.0010    0.0004
##      8        1.3133             nan     0.0010    0.0004
##      9        1.3123             nan     0.0010    0.0005
##     10        1.3114             nan     0.0010    0.0004
##     20        1.3024             nan     0.0010    0.0004
##     40        1.2845             nan     0.0010    0.0004
##     60        1.2675             nan     0.0010    0.0004
##     80        1.2511             nan     0.0010    0.0004
##    100        1.2353             nan     0.0010    0.0004
##    120        1.2197             nan     0.0010    0.0003
##    140        1.2047             nan     0.0010    0.0003
##    160        1.1902             nan     0.0010    0.0003
##    180        1.1761             nan     0.0010    0.0003
##    200        1.1624             nan     0.0010    0.0003
##    220        1.1490             nan     0.0010    0.0003
##    240        1.1360             nan     0.0010    0.0003
##    260        1.1235             nan     0.0010    0.0002
##    280        1.1115             nan     0.0010    0.0003
##    300        1.0996             nan     0.0010    0.0003
##    320        1.0882             nan     0.0010    0.0002
##    340        1.0773             nan     0.0010    0.0002
##    360        1.0663             nan     0.0010    0.0002
##    380        1.0561             nan     0.0010    0.0002
##    400        1.0455             nan     0.0010    0.0002
##    420        1.0353             nan     0.0010    0.0002
##    440        1.0256             nan     0.0010    0.0002
##    460        1.0160             nan     0.0010    0.0002
##    480        1.0068             nan     0.0010    0.0002
##    500        0.9976             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3178             nan     0.0010    0.0004
##      4        1.3169             nan     0.0010    0.0004
##      5        1.3160             nan     0.0010    0.0004
##      6        1.3150             nan     0.0010    0.0004
##      7        1.3140             nan     0.0010    0.0005
##      8        1.3132             nan     0.0010    0.0004
##      9        1.3123             nan     0.0010    0.0004
##     10        1.3114             nan     0.0010    0.0004
##     20        1.3025             nan     0.0010    0.0004
##     40        1.2853             nan     0.0010    0.0004
##     60        1.2682             nan     0.0010    0.0004
##     80        1.2518             nan     0.0010    0.0003
##    100        1.2358             nan     0.0010    0.0004
##    120        1.2204             nan     0.0010    0.0004
##    140        1.2053             nan     0.0010    0.0003
##    160        1.1908             nan     0.0010    0.0003
##    180        1.1766             nan     0.0010    0.0003
##    200        1.1631             nan     0.0010    0.0003
##    220        1.1499             nan     0.0010    0.0003
##    240        1.1371             nan     0.0010    0.0003
##    260        1.1246             nan     0.0010    0.0003
##    280        1.1127             nan     0.0010    0.0003
##    300        1.1011             nan     0.0010    0.0002
##    320        1.0896             nan     0.0010    0.0003
##    340        1.0787             nan     0.0010    0.0002
##    360        1.0680             nan     0.0010    0.0002
##    380        1.0577             nan     0.0010    0.0002
##    400        1.0476             nan     0.0010    0.0002
##    420        1.0376             nan     0.0010    0.0002
##    440        1.0279             nan     0.0010    0.0001
##    460        1.0183             nan     0.0010    0.0002
##    480        1.0090             nan     0.0010    0.0002
##    500        1.0002             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3188             nan     0.0010    0.0004
##      3        1.3179             nan     0.0010    0.0004
##      4        1.3168             nan     0.0010    0.0004
##      5        1.3159             nan     0.0010    0.0004
##      6        1.3149             nan     0.0010    0.0004
##      7        1.3139             nan     0.0010    0.0004
##      8        1.3129             nan     0.0010    0.0004
##      9        1.3120             nan     0.0010    0.0004
##     10        1.3110             nan     0.0010    0.0005
##     20        1.3015             nan     0.0010    0.0004
##     40        1.2829             nan     0.0010    0.0004
##     60        1.2645             nan     0.0010    0.0004
##     80        1.2468             nan     0.0010    0.0004
##    100        1.2299             nan     0.0010    0.0004
##    120        1.2132             nan     0.0010    0.0004
##    140        1.1977             nan     0.0010    0.0003
##    160        1.1825             nan     0.0010    0.0003
##    180        1.1679             nan     0.0010    0.0003
##    200        1.1535             nan     0.0010    0.0003
##    220        1.1395             nan     0.0010    0.0003
##    240        1.1262             nan     0.0010    0.0003
##    260        1.1128             nan     0.0010    0.0003
##    280        1.1000             nan     0.0010    0.0003
##    300        1.0876             nan     0.0010    0.0003
##    320        1.0757             nan     0.0010    0.0003
##    340        1.0641             nan     0.0010    0.0002
##    360        1.0526             nan     0.0010    0.0003
##    380        1.0416             nan     0.0010    0.0002
##    400        1.0308             nan     0.0010    0.0002
##    420        1.0203             nan     0.0010    0.0002
##    440        1.0101             nan     0.0010    0.0002
##    460        1.0000             nan     0.0010    0.0002
##    480        0.9901             nan     0.0010    0.0002
##    500        0.9808             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3196             nan     0.0010    0.0005
##      2        1.3187             nan     0.0010    0.0004
##      3        1.3178             nan     0.0010    0.0004
##      4        1.3167             nan     0.0010    0.0005
##      5        1.3157             nan     0.0010    0.0004
##      6        1.3148             nan     0.0010    0.0004
##      7        1.3139             nan     0.0010    0.0004
##      8        1.3130             nan     0.0010    0.0004
##      9        1.3120             nan     0.0010    0.0004
##     10        1.3110             nan     0.0010    0.0005
##     20        1.3013             nan     0.0010    0.0004
##     40        1.2828             nan     0.0010    0.0004
##     60        1.2650             nan     0.0010    0.0004
##     80        1.2476             nan     0.0010    0.0004
##    100        1.2311             nan     0.0010    0.0004
##    120        1.2150             nan     0.0010    0.0004
##    140        1.1992             nan     0.0010    0.0003
##    160        1.1839             nan     0.0010    0.0003
##    180        1.1692             nan     0.0010    0.0003
##    200        1.1545             nan     0.0010    0.0003
##    220        1.1408             nan     0.0010    0.0003
##    240        1.1274             nan     0.0010    0.0003
##    260        1.1144             nan     0.0010    0.0003
##    280        1.1015             nan     0.0010    0.0003
##    300        1.0894             nan     0.0010    0.0003
##    320        1.0773             nan     0.0010    0.0003
##    340        1.0658             nan     0.0010    0.0003
##    360        1.0545             nan     0.0010    0.0002
##    380        1.0436             nan     0.0010    0.0002
##    400        1.0331             nan     0.0010    0.0002
##    420        1.0227             nan     0.0010    0.0002
##    440        1.0127             nan     0.0010    0.0002
##    460        1.0028             nan     0.0010    0.0002
##    480        0.9931             nan     0.0010    0.0002
##    500        0.9838             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3187             nan     0.0010    0.0004
##      3        1.3177             nan     0.0010    0.0004
##      4        1.3167             nan     0.0010    0.0004
##      5        1.3158             nan     0.0010    0.0004
##      6        1.3148             nan     0.0010    0.0004
##      7        1.3138             nan     0.0010    0.0005
##      8        1.3128             nan     0.0010    0.0004
##      9        1.3119             nan     0.0010    0.0004
##     10        1.3108             nan     0.0010    0.0004
##     20        1.3015             nan     0.0010    0.0004
##     40        1.2833             nan     0.0010    0.0005
##     60        1.2654             nan     0.0010    0.0004
##     80        1.2485             nan     0.0010    0.0003
##    100        1.2323             nan     0.0010    0.0003
##    120        1.2165             nan     0.0010    0.0003
##    140        1.2008             nan     0.0010    0.0003
##    160        1.1856             nan     0.0010    0.0003
##    180        1.1710             nan     0.0010    0.0003
##    200        1.1569             nan     0.0010    0.0003
##    220        1.1432             nan     0.0010    0.0003
##    240        1.1300             nan     0.0010    0.0003
##    260        1.1169             nan     0.0010    0.0003
##    280        1.1044             nan     0.0010    0.0003
##    300        1.0919             nan     0.0010    0.0003
##    320        1.0800             nan     0.0010    0.0003
##    340        1.0686             nan     0.0010    0.0003
##    360        1.0575             nan     0.0010    0.0002
##    380        1.0467             nan     0.0010    0.0002
##    400        1.0361             nan     0.0010    0.0002
##    420        1.0259             nan     0.0010    0.0002
##    440        1.0159             nan     0.0010    0.0002
##    460        1.0061             nan     0.0010    0.0002
##    480        0.9966             nan     0.0010    0.0002
##    500        0.9874             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3124             nan     0.0100    0.0041
##      2        1.3040             nan     0.0100    0.0038
##      3        1.2960             nan     0.0100    0.0042
##      4        1.2877             nan     0.0100    0.0037
##      5        1.2799             nan     0.0100    0.0032
##      6        1.2716             nan     0.0100    0.0038
##      7        1.2630             nan     0.0100    0.0037
##      8        1.2550             nan     0.0100    0.0038
##      9        1.2470             nan     0.0100    0.0035
##     10        1.2392             nan     0.0100    0.0037
##     20        1.1720             nan     0.0100    0.0030
##     40        1.0611             nan     0.0100    0.0021
##     60        0.9745             nan     0.0100    0.0015
##     80        0.9067             nan     0.0100    0.0012
##    100        0.8507             nan     0.0100    0.0010
##    120        0.8060             nan     0.0100    0.0008
##    140        0.7694             nan     0.0100    0.0005
##    160        0.7387             nan     0.0100    0.0005
##    180        0.7128             nan     0.0100    0.0003
##    200        0.6898             nan     0.0100    0.0002
##    220        0.6696             nan     0.0100    0.0000
##    240        0.6510             nan     0.0100    0.0001
##    260        0.6348             nan     0.0100    0.0001
##    280        0.6196             nan     0.0100   -0.0001
##    300        0.6061             nan     0.0100    0.0001
##    320        0.5931             nan     0.0100    0.0001
##    340        0.5814             nan     0.0100   -0.0000
##    360        0.5700             nan     0.0100   -0.0002
##    380        0.5594             nan     0.0100    0.0001
##    400        0.5492             nan     0.0100    0.0000
##    420        0.5398             nan     0.0100   -0.0001
##    440        0.5310             nan     0.0100   -0.0000
##    460        0.5224             nan     0.0100    0.0000
##    480        0.5138             nan     0.0100   -0.0001
##    500        0.5058             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3111             nan     0.0100    0.0042
##      2        1.3019             nan     0.0100    0.0041
##      3        1.2931             nan     0.0100    0.0038
##      4        1.2850             nan     0.0100    0.0035
##      5        1.2768             nan     0.0100    0.0036
##      6        1.2694             nan     0.0100    0.0033
##      7        1.2608             nan     0.0100    0.0040
##      8        1.2531             nan     0.0100    0.0037
##      9        1.2450             nan     0.0100    0.0037
##     10        1.2382             nan     0.0100    0.0032
##     20        1.1716             nan     0.0100    0.0026
##     40        1.0607             nan     0.0100    0.0021
##     60        0.9749             nan     0.0100    0.0017
##     80        0.9080             nan     0.0100    0.0011
##    100        0.8561             nan     0.0100    0.0010
##    120        0.8112             nan     0.0100    0.0008
##    140        0.7759             nan     0.0100    0.0004
##    160        0.7450             nan     0.0100    0.0002
##    180        0.7181             nan     0.0100    0.0003
##    200        0.6964             nan     0.0100    0.0004
##    220        0.6759             nan     0.0100    0.0001
##    240        0.6582             nan     0.0100    0.0001
##    260        0.6423             nan     0.0100   -0.0001
##    280        0.6277             nan     0.0100    0.0000
##    300        0.6136             nan     0.0100    0.0001
##    320        0.6017             nan     0.0100    0.0002
##    340        0.5903             nan     0.0100   -0.0002
##    360        0.5792             nan     0.0100    0.0000
##    380        0.5690             nan     0.0100   -0.0000
##    400        0.5589             nan     0.0100   -0.0001
##    420        0.5494             nan     0.0100    0.0000
##    440        0.5412             nan     0.0100    0.0000
##    460        0.5330             nan     0.0100   -0.0000
##    480        0.5256             nan     0.0100    0.0000
##    500        0.5170             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3114             nan     0.0100    0.0039
##      2        1.3037             nan     0.0100    0.0036
##      3        1.2955             nan     0.0100    0.0035
##      4        1.2874             nan     0.0100    0.0039
##      5        1.2790             nan     0.0100    0.0037
##      6        1.2715             nan     0.0100    0.0036
##      7        1.2640             nan     0.0100    0.0034
##      8        1.2561             nan     0.0100    0.0034
##      9        1.2492             nan     0.0100    0.0030
##     10        1.2412             nan     0.0100    0.0038
##     20        1.1725             nan     0.0100    0.0025
##     40        1.0642             nan     0.0100    0.0023
##     60        0.9782             nan     0.0100    0.0017
##     80        0.9121             nan     0.0100    0.0010
##    100        0.8589             nan     0.0100    0.0011
##    120        0.8147             nan     0.0100    0.0005
##    140        0.7771             nan     0.0100    0.0007
##    160        0.7462             nan     0.0100    0.0005
##    180        0.7204             nan     0.0100    0.0001
##    200        0.6993             nan     0.0100    0.0001
##    220        0.6790             nan     0.0100    0.0002
##    240        0.6614             nan     0.0100    0.0001
##    260        0.6459             nan     0.0100    0.0001
##    280        0.6317             nan     0.0100    0.0001
##    300        0.6188             nan     0.0100    0.0001
##    320        0.6070             nan     0.0100    0.0000
##    340        0.5954             nan     0.0100    0.0001
##    360        0.5842             nan     0.0100    0.0000
##    380        0.5742             nan     0.0100   -0.0001
##    400        0.5648             nan     0.0100   -0.0000
##    420        0.5559             nan     0.0100   -0.0002
##    440        0.5474             nan     0.0100   -0.0002
##    460        0.5382             nan     0.0100   -0.0000
##    480        0.5312             nan     0.0100   -0.0000
##    500        0.5230             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3109             nan     0.0100    0.0045
##      2        1.3011             nan     0.0100    0.0045
##      3        1.2920             nan     0.0100    0.0041
##      4        1.2835             nan     0.0100    0.0037
##      5        1.2758             nan     0.0100    0.0035
##      6        1.2668             nan     0.0100    0.0037
##      7        1.2582             nan     0.0100    0.0038
##      8        1.2506             nan     0.0100    0.0038
##      9        1.2418             nan     0.0100    0.0042
##     10        1.2338             nan     0.0100    0.0034
##     20        1.1611             nan     0.0100    0.0029
##     40        1.0431             nan     0.0100    0.0022
##     60        0.9529             nan     0.0100    0.0017
##     80        0.8817             nan     0.0100    0.0012
##    100        0.8263             nan     0.0100    0.0009
##    120        0.7801             nan     0.0100    0.0008
##    140        0.7410             nan     0.0100    0.0006
##    160        0.7087             nan     0.0100    0.0005
##    180        0.6812             nan     0.0100    0.0003
##    200        0.6576             nan     0.0100    0.0001
##    220        0.6365             nan     0.0100    0.0002
##    240        0.6168             nan     0.0100    0.0000
##    260        0.5990             nan     0.0100    0.0000
##    280        0.5826             nan     0.0100    0.0000
##    300        0.5678             nan     0.0100    0.0001
##    320        0.5549             nan     0.0100   -0.0000
##    340        0.5432             nan     0.0100   -0.0000
##    360        0.5307             nan     0.0100   -0.0001
##    380        0.5198             nan     0.0100    0.0000
##    400        0.5084             nan     0.0100   -0.0000
##    420        0.4975             nan     0.0100   -0.0000
##    440        0.4874             nan     0.0100    0.0001
##    460        0.4782             nan     0.0100    0.0000
##    480        0.4695             nan     0.0100    0.0000
##    500        0.4603             nan     0.0100   -0.0001
## 
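The lengthy per-iteration boosting log above is emitted because `gbm` runs verbosely by default when tuned through `caret::train`. A minimal sketch of how to silence it (the object names `GBM_Tune` and `WBCS_Train` are hypothetical stand-ins for this document's actual training objects; `diagnosis` is the response variable described in the dataset assessment):

```r
##################################
# Suppressing verbose gbm output
##################################
library(caret)

# verbose = FALSE is forwarded through train()'s ... argument
# to gbm::gbm, silencing the per-iteration deviance trace.
GBM_Tune <- train(diagnosis ~ .,
                  data = WBCS_Train,   # hypothetical training-set name
                  method = "gbm",
                  trControl = trainControl(method = "cv", number = 10),
                  verbose = FALSE)
```

Alternatively, setting the chunk option `results='hide'` in the R Markdown source keeps the model fit identical while dropping the log from the rendered output.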
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
## [verbose boosting logs for the remaining resamples and tuning
##  combinations (shrinkage = 0.0010, 0.0100 and 0.1000; 500
##  iterations each) follow the same pattern as above, with
##  TrainDeviance decreasing monotonically in every run]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
## (verbose boosting traces for the remaining shrinkage = 0.001 resamples omitted;
##  each follows the same pattern, with train deviance declining from ~1.32 to ~1.00
##  over 500 iterations)
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3127             nan     0.0100    0.0035
##      2        1.3041             nan     0.0100    0.0040
##      3        1.2960             nan     0.0100    0.0035
##      4        1.2883             nan     0.0100    0.0031
##      5        1.2802             nan     0.0100    0.0038
##      6        1.2715             nan     0.0100    0.0035
##      7        1.2636             nan     0.0100    0.0035
##      8        1.2552             nan     0.0100    0.0035
##      9        1.2476             nan     0.0100    0.0033
##     10        1.2399             nan     0.0100    0.0037
##     20        1.1685             nan     0.0100    0.0031
##     40        1.0582             nan     0.0100    0.0024
##     60        0.9744             nan     0.0100    0.0016
##     80        0.9076             nan     0.0100    0.0011
##    100        0.8542             nan     0.0100    0.0010
##    120        0.8104             nan     0.0100    0.0006
##    140        0.7743             nan     0.0100    0.0006
##    160        0.7446             nan     0.0100    0.0004
##    180        0.7176             nan     0.0100    0.0002
##    200        0.6940             nan     0.0100    0.0003
##    220        0.6731             nan     0.0100    0.0003
##    240        0.6559             nan     0.0100    0.0002
##    260        0.6391             nan     0.0100    0.0002
##    280        0.6233             nan     0.0100    0.0000
##    300        0.6093             nan     0.0100    0.0000
##    320        0.5974             nan     0.0100    0.0002
##    340        0.5841             nan     0.0100    0.0001
##    360        0.5734             nan     0.0100    0.0002
##    380        0.5624             nan     0.0100   -0.0001
##    400        0.5524             nan     0.0100   -0.0001
##    420        0.5432             nan     0.0100   -0.0002
##    440        0.5348             nan     0.0100   -0.0000
##    460        0.5267             nan     0.0100   -0.0000
##    480        0.5180             nan     0.0100   -0.0000
##    500        0.5097             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
## (verbose boosting traces for the remaining shrinkage = 0.01 resamples omitted;
##  each follows the same pattern, with train deviance declining from ~1.31 to ~0.41-0.52
##  over 500 iterations)
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3118             nan     0.0100    0.0038
##      2        1.3019             nan     0.0100    0.0047
##      3        1.2910             nan     0.0100    0.0048
##      4        1.2805             nan     0.0100    0.0047
##      5        1.2723             nan     0.0100    0.0040
##      6        1.2631             nan     0.0100    0.0043
##      7        1.2537             nan     0.0100    0.0042
##      8        1.2445             nan     0.0100    0.0039
##      9        1.2364             nan     0.0100    0.0040
##     10        1.2288             nan     0.0100    0.0031
##     20        1.1542             nan     0.0100    0.0031
##     40        1.0337             nan     0.0100    0.0027
##     60        0.9395             nan     0.0100    0.0017
##     80        0.8667             nan     0.0100    0.0014
##    100        0.8064             nan     0.0100    0.0013
##    120        0.7595             nan     0.0100    0.0006
##    140        0.7220             nan     0.0100    0.0005
##    160        0.6886             nan     0.0100    0.0003
##    180        0.6611             nan     0.0100    0.0001
##    200        0.6346             nan     0.0100    0.0004
##    220        0.6110             nan     0.0100    0.0001
##    240        0.5898             nan     0.0100    0.0004
##    260        0.5711             nan     0.0100    0.0002
##    280        0.5545             nan     0.0100    0.0002
##    300        0.5386             nan     0.0100    0.0000
##    320        0.5245             nan     0.0100    0.0001
##    340        0.5109             nan     0.0100   -0.0001
##    360        0.4976             nan     0.0100   -0.0001
##    380        0.4860             nan     0.0100    0.0000
##    400        0.4746             nan     0.0100    0.0001
##    420        0.4636             nan     0.0100    0.0000
##    440        0.4527             nan     0.0100    0.0001
##    460        0.4419             nan     0.0100   -0.0001
##    480        0.4326             nan     0.0100   -0.0001
##    500        0.4233             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3099             nan     0.0100    0.0047
##      2        1.3005             nan     0.0100    0.0045
##      3        1.2915             nan     0.0100    0.0043
##      4        1.2828             nan     0.0100    0.0037
##      5        1.2735             nan     0.0100    0.0044
##      6        1.2647             nan     0.0100    0.0041
##      7        1.2567             nan     0.0100    0.0032
##      8        1.2482             nan     0.0100    0.0038
##      9        1.2393             nan     0.0100    0.0041
##     10        1.2317             nan     0.0100    0.0035
##     20        1.1549             nan     0.0100    0.0037
##     40        1.0338             nan     0.0100    0.0027
##     60        0.9418             nan     0.0100    0.0018
##     80        0.8711             nan     0.0100    0.0012
##    100        0.8134             nan     0.0100    0.0011
##    120        0.7669             nan     0.0100    0.0006
##    140        0.7268             nan     0.0100    0.0007
##    160        0.6945             nan     0.0100    0.0002
##    180        0.6654             nan     0.0100    0.0003
##    200        0.6415             nan     0.0100   -0.0001
##    220        0.6183             nan     0.0100    0.0002
##    240        0.5986             nan     0.0100    0.0000
##    260        0.5795             nan     0.0100    0.0000
##    280        0.5619             nan     0.0100    0.0002
##    300        0.5464             nan     0.0100    0.0000
##    320        0.5327             nan     0.0100    0.0001
##    340        0.5201             nan     0.0100   -0.0001
##    360        0.5077             nan     0.0100   -0.0003
##    380        0.4950             nan     0.0100    0.0000
##    400        0.4835             nan     0.0100   -0.0001
##    420        0.4721             nan     0.0100   -0.0001
##    440        0.4614             nan     0.0100    0.0000
##    460        0.4519             nan     0.0100   -0.0000
##    480        0.4422             nan     0.0100    0.0001
##    500        0.4331             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2354             nan     0.1000    0.0362
##      2        1.1710             nan     0.1000    0.0260
##      3        1.1130             nan     0.1000    0.0255
##      4        1.0554             nan     0.1000    0.0243
##      5        1.0034             nan     0.1000    0.0214
##      6        0.9617             nan     0.1000    0.0186
##      7        0.9263             nan     0.1000    0.0130
##      8        0.8922             nan     0.1000    0.0150
##      9        0.8658             nan     0.1000    0.0113
##     10        0.8426             nan     0.1000    0.0087
##     20        0.6823             nan     0.1000    0.0042
##     40        0.5503             nan     0.1000   -0.0005
##     60        0.4665             nan     0.1000    0.0003
##     80        0.4030             nan     0.1000   -0.0005
##    100        0.3523             nan     0.1000   -0.0002
##    120        0.3096             nan     0.1000   -0.0001
##    140        0.2773             nan     0.1000   -0.0003
##    160        0.2490             nan     0.1000   -0.0005
##    180        0.2282             nan     0.1000   -0.0002
##    200        0.2048             nan     0.1000   -0.0008
##    220        0.1834             nan     0.1000   -0.0004
##    240        0.1651             nan     0.1000   -0.0002
##    260        0.1507             nan     0.1000   -0.0003
##    280        0.1360             nan     0.1000   -0.0006
##    300        0.1234             nan     0.1000   -0.0000
##    320        0.1124             nan     0.1000   -0.0001
##    340        0.1035             nan     0.1000   -0.0002
##    360        0.0952             nan     0.1000   -0.0002
##    380        0.0872             nan     0.1000   -0.0003
##    400        0.0799             nan     0.1000   -0.0003
##    420        0.0731             nan     0.1000   -0.0005
##    440        0.0666             nan     0.1000   -0.0001
##    460        0.0617             nan     0.1000   -0.0001
##    480        0.0563             nan     0.1000   -0.0001
##    500        0.0521             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2363             nan     0.1000    0.0415
##      2        1.1683             nan     0.1000    0.0301
##      3        1.1056             nan     0.1000    0.0283
##      4        1.0535             nan     0.1000    0.0223
##      5        1.0095             nan     0.1000    0.0197
##      6        0.9746             nan     0.1000    0.0136
##      7        0.9401             nan     0.1000    0.0132
##      8        0.9106             nan     0.1000    0.0123
##      9        0.8821             nan     0.1000    0.0104
##     10        0.8532             nan     0.1000    0.0111
##     20        0.6928             nan     0.1000    0.0019
##     40        0.5538             nan     0.1000   -0.0002
##     60        0.4785             nan     0.1000   -0.0003
##     80        0.4160             nan     0.1000   -0.0000
##    100        0.3627             nan     0.1000   -0.0002
##    120        0.3154             nan     0.1000   -0.0001
##    140        0.2827             nan     0.1000   -0.0006
##    160        0.2513             nan     0.1000   -0.0002
##    180        0.2274             nan     0.1000   -0.0004
##    200        0.2078             nan     0.1000    0.0000
##    220        0.1901             nan     0.1000   -0.0002
##    240        0.1726             nan     0.1000   -0.0007
##    260        0.1559             nan     0.1000   -0.0004
##    280        0.1433             nan     0.1000   -0.0006
##    300        0.1309             nan     0.1000   -0.0006
##    320        0.1189             nan     0.1000   -0.0001
##    340        0.1096             nan     0.1000   -0.0002
##    360        0.1010             nan     0.1000   -0.0005
##    380        0.0922             nan     0.1000   -0.0003
##    400        0.0844             nan     0.1000   -0.0003
##    420        0.0779             nan     0.1000   -0.0002
##    440        0.0714             nan     0.1000   -0.0003
##    460        0.0661             nan     0.1000   -0.0001
##    480        0.0609             nan     0.1000    0.0001
##    500        0.0563             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2340             nan     0.1000    0.0385
##      2        1.1709             nan     0.1000    0.0248
##      3        1.1110             nan     0.1000    0.0240
##      4        1.0545             nan     0.1000    0.0271
##      5        1.0102             nan     0.1000    0.0194
##      6        0.9686             nan     0.1000    0.0198
##      7        0.9356             nan     0.1000    0.0119
##      8        0.9073             nan     0.1000    0.0133
##      9        0.8794             nan     0.1000    0.0123
##     10        0.8561             nan     0.1000    0.0103
##     20        0.6948             nan     0.1000    0.0028
##     40        0.5612             nan     0.1000   -0.0014
##     60        0.4826             nan     0.1000    0.0001
##     80        0.4252             nan     0.1000    0.0009
##    100        0.3733             nan     0.1000    0.0004
##    120        0.3315             nan     0.1000   -0.0020
##    140        0.3007             nan     0.1000   -0.0008
##    160        0.2701             nan     0.1000   -0.0010
##    180        0.2413             nan     0.1000   -0.0006
##    200        0.2201             nan     0.1000   -0.0011
##    220        0.1977             nan     0.1000   -0.0010
##    240        0.1781             nan     0.1000   -0.0002
##    260        0.1608             nan     0.1000   -0.0008
##    280        0.1489             nan     0.1000   -0.0003
##    300        0.1367             nan     0.1000   -0.0006
##    320        0.1239             nan     0.1000   -0.0007
##    340        0.1144             nan     0.1000   -0.0003
##    360        0.1074             nan     0.1000   -0.0004
##    380        0.0989             nan     0.1000   -0.0004
##    400        0.0919             nan     0.1000   -0.0005
##    420        0.0840             nan     0.1000   -0.0001
##    440        0.0776             nan     0.1000   -0.0003
##    460        0.0715             nan     0.1000   -0.0002
##    480        0.0658             nan     0.1000   -0.0002
##    500        0.0606             nan     0.1000   -0.0003
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2294             nan     0.1000    0.0416
##      2        1.1550             nan     0.1000    0.0317
##      3        1.0948             nan     0.1000    0.0243
##      4        1.0397             nan     0.1000    0.0238
##      5        0.9916             nan     0.1000    0.0188
##      6        0.9509             nan     0.1000    0.0197
##      7        0.9172             nan     0.1000    0.0111
##      8        0.8822             nan     0.1000    0.0146
##      9        0.8548             nan     0.1000    0.0095
##     10        0.8260             nan     0.1000    0.0107
##     20        0.6525             nan     0.1000    0.0026
##     40        0.5086             nan     0.1000   -0.0007
##     60        0.4250             nan     0.1000   -0.0004
##     80        0.3579             nan     0.1000   -0.0007
##    100        0.3044             nan     0.1000    0.0001
##    120        0.2673             nan     0.1000   -0.0001
##    140        0.2299             nan     0.1000   -0.0007
##    160        0.1999             nan     0.1000   -0.0004
##    180        0.1775             nan     0.1000   -0.0007
##    200        0.1541             nan     0.1000   -0.0004
##    220        0.1358             nan     0.1000   -0.0001
##    240        0.1197             nan     0.1000   -0.0005
##    260        0.1076             nan     0.1000    0.0000
##    280        0.0963             nan     0.1000   -0.0001
##    300        0.0860             nan     0.1000   -0.0001
##    320        0.0765             nan     0.1000   -0.0003
##    340        0.0697             nan     0.1000   -0.0002
##    360        0.0629             nan     0.1000   -0.0001
##    380        0.0569             nan     0.1000   -0.0001
##    400        0.0514             nan     0.1000   -0.0001
##    420        0.0463             nan     0.1000   -0.0001
##    440        0.0417             nan     0.1000   -0.0003
##    460        0.0377             nan     0.1000   -0.0001
##    480        0.0340             nan     0.1000   -0.0001
##    500        0.0307             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2319             nan     0.1000    0.0388
##      2        1.1571             nan     0.1000    0.0308
##      3        1.0934             nan     0.1000    0.0265
##      4        1.0392             nan     0.1000    0.0233
##      5        0.9942             nan     0.1000    0.0203
##      6        0.9508             nan     0.1000    0.0175
##      7        0.9136             nan     0.1000    0.0163
##      8        0.8850             nan     0.1000    0.0122
##      9        0.8568             nan     0.1000    0.0109
##     10        0.8311             nan     0.1000    0.0070
##     20        0.6659             nan     0.1000    0.0046
##     40        0.5144             nan     0.1000    0.0007
##     60        0.4251             nan     0.1000    0.0000
##     80        0.3657             nan     0.1000   -0.0014
##    100        0.3116             nan     0.1000   -0.0014
##    120        0.2711             nan     0.1000   -0.0004
##    140        0.2349             nan     0.1000   -0.0008
##    160        0.2067             nan     0.1000   -0.0003
##    180        0.1812             nan     0.1000   -0.0007
##    200        0.1579             nan     0.1000   -0.0004
##    220        0.1392             nan     0.1000   -0.0004
##    240        0.1228             nan     0.1000   -0.0005
##    260        0.1092             nan     0.1000   -0.0005
##    280        0.0964             nan     0.1000   -0.0002
##    300        0.0872             nan     0.1000   -0.0002
##    320        0.0787             nan     0.1000   -0.0002
##    340        0.0696             nan     0.1000    0.0000
##    360        0.0626             nan     0.1000   -0.0002
##    380        0.0567             nan     0.1000   -0.0003
##    400        0.0507             nan     0.1000   -0.0001
##    420        0.0460             nan     0.1000   -0.0001
##    440        0.0416             nan     0.1000   -0.0001
##    460        0.0378             nan     0.1000   -0.0002
##    480        0.0340             nan     0.1000   -0.0001
##    500        0.0305             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2418             nan     0.1000    0.0333
##      2        1.1624             nan     0.1000    0.0365
##      3        1.0999             nan     0.1000    0.0283
##      4        1.0443             nan     0.1000    0.0210
##      5        0.9975             nan     0.1000    0.0207
##      6        0.9558             nan     0.1000    0.0175
##      7        0.9186             nan     0.1000    0.0146
##      8        0.8840             nan     0.1000    0.0124
##      9        0.8533             nan     0.1000    0.0111
##     10        0.8218             nan     0.1000    0.0128
##     20        0.6663             nan     0.1000    0.0028
##     40        0.5271             nan     0.1000   -0.0000
##     60        0.4425             nan     0.1000   -0.0004
##     80        0.3795             nan     0.1000   -0.0011
##    100        0.3254             nan     0.1000   -0.0006
##    120        0.2800             nan     0.1000   -0.0010
##    140        0.2450             nan     0.1000   -0.0007
##    160        0.2099             nan     0.1000   -0.0006
##    180        0.1836             nan     0.1000   -0.0001
##    200        0.1603             nan     0.1000   -0.0008
##    220        0.1423             nan     0.1000   -0.0003
##    240        0.1259             nan     0.1000   -0.0005
##    260        0.1130             nan     0.1000   -0.0006
##    280        0.1017             nan     0.1000   -0.0003
##    300        0.0913             nan     0.1000   -0.0002
##    320        0.0823             nan     0.1000   -0.0008
##    340        0.0742             nan     0.1000   -0.0000
##    360        0.0659             nan     0.1000   -0.0002
##    380        0.0594             nan     0.1000   -0.0001
##    400        0.0538             nan     0.1000   -0.0002
##    420        0.0484             nan     0.1000   -0.0002
##    440        0.0439             nan     0.1000   -0.0001
##    460        0.0397             nan     0.1000   -0.0002
##    480        0.0361             nan     0.1000   -0.0001
##    500        0.0326             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2334             nan     0.1000    0.0370
##      2        1.1564             nan     0.1000    0.0347
##      3        1.0940             nan     0.1000    0.0257
##      4        1.0399             nan     0.1000    0.0230
##      5        0.9841             nan     0.1000    0.0225
##      6        0.9410             nan     0.1000    0.0173
##      7        0.9031             nan     0.1000    0.0162
##      8        0.8645             nan     0.1000    0.0164
##      9        0.8330             nan     0.1000    0.0141
##     10        0.8044             nan     0.1000    0.0103
##     20        0.6343             nan     0.1000    0.0035
##     40        0.4703             nan     0.1000   -0.0001
##     60        0.3712             nan     0.1000   -0.0002
##     80        0.3094             nan     0.1000   -0.0010
##    100        0.2570             nan     0.1000   -0.0003
##    120        0.2136             nan     0.1000   -0.0006
##    140        0.1824             nan     0.1000   -0.0004
##    160        0.1549             nan     0.1000   -0.0005
##    180        0.1319             nan     0.1000    0.0002
##    200        0.1128             nan     0.1000   -0.0003
##    220        0.0987             nan     0.1000   -0.0003
##    240        0.0855             nan     0.1000   -0.0002
##    260        0.0743             nan     0.1000   -0.0001
##    280        0.0650             nan     0.1000   -0.0003
##    300        0.0563             nan     0.1000   -0.0001
##    320        0.0494             nan     0.1000   -0.0000
##    340        0.0439             nan     0.1000   -0.0001
##    360        0.0392             nan     0.1000   -0.0001
##    380        0.0345             nan     0.1000   -0.0000
##    400        0.0304             nan     0.1000   -0.0001
##    420        0.0269             nan     0.1000   -0.0001
##    440        0.0237             nan     0.1000   -0.0001
##    460        0.0209             nan     0.1000   -0.0001
##    480        0.0184             nan     0.1000   -0.0001
##    500        0.0164             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2251             nan     0.1000    0.0421
##      2        1.1527             nan     0.1000    0.0330
##      3        1.0842             nan     0.1000    0.0295
##      4        1.0286             nan     0.1000    0.0230
##      5        0.9785             nan     0.1000    0.0200
##      6        0.9341             nan     0.1000    0.0174
##      7        0.8966             nan     0.1000    0.0144
##      8        0.8612             nan     0.1000    0.0119
##      9        0.8346             nan     0.1000    0.0089
##     10        0.8122             nan     0.1000    0.0099
##     20        0.6520             nan     0.1000    0.0021
##     40        0.4859             nan     0.1000   -0.0008
##     60        0.3838             nan     0.1000   -0.0001
##     80        0.3218             nan     0.1000   -0.0014
##    100        0.2645             nan     0.1000    0.0004
##    120        0.2221             nan     0.1000   -0.0006
##    140        0.1859             nan     0.1000   -0.0016
##    160        0.1596             nan     0.1000   -0.0006
##    180        0.1362             nan     0.1000   -0.0003
##    200        0.1172             nan     0.1000   -0.0005
##    220        0.1018             nan     0.1000   -0.0003
##    240        0.0885             nan     0.1000   -0.0003
##    260        0.0766             nan     0.1000   -0.0004
##    280        0.0671             nan     0.1000   -0.0001
##    300        0.0587             nan     0.1000   -0.0001
##    320        0.0517             nan     0.1000   -0.0002
##    340        0.0461             nan     0.1000   -0.0002
##    360        0.0401             nan     0.1000   -0.0000
##    380        0.0358             nan     0.1000   -0.0001
##    400        0.0322             nan     0.1000   -0.0000
##    420        0.0284             nan     0.1000   -0.0002
##    440        0.0250             nan     0.1000   -0.0000
##    460        0.0219             nan     0.1000   -0.0001
##    480        0.0194             nan     0.1000   -0.0002
##    500        0.0173             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2203             nan     0.1000    0.0452
##      2        1.1399             nan     0.1000    0.0359
##      3        1.0795             nan     0.1000    0.0277
##      4        1.0241             nan     0.1000    0.0238
##      5        0.9755             nan     0.1000    0.0216
##      6        0.9349             nan     0.1000    0.0168
##      7        0.9006             nan     0.1000    0.0127
##      8        0.8702             nan     0.1000    0.0119
##      9        0.8371             nan     0.1000    0.0120
##     10        0.8123             nan     0.1000    0.0084
##     20        0.6361             nan     0.1000    0.0026
##     40        0.4837             nan     0.1000    0.0016
##     60        0.3914             nan     0.1000   -0.0006
##     80        0.3241             nan     0.1000   -0.0000
##    100        0.2706             nan     0.1000   -0.0007
##    120        0.2256             nan     0.1000   -0.0001
##    140        0.1925             nan     0.1000   -0.0006
##    160        0.1648             nan     0.1000   -0.0006
##    180        0.1423             nan     0.1000   -0.0005
##    200        0.1226             nan     0.1000   -0.0006
##    220        0.1053             nan     0.1000   -0.0007
##    240        0.0920             nan     0.1000   -0.0005
##    260        0.0806             nan     0.1000   -0.0002
##    280        0.0703             nan     0.1000   -0.0003
##    300        0.0618             nan     0.1000   -0.0001
##    320        0.0549             nan     0.1000   -0.0004
##    340        0.0490             nan     0.1000    0.0001
##    360        0.0430             nan     0.1000   -0.0003
##    380        0.0384             nan     0.1000   -0.0002
##    400        0.0336             nan     0.1000   -0.0001
##    420        0.0299             nan     0.1000   -0.0001
##    440        0.0265             nan     0.1000   -0.0001
##    460        0.0235             nan     0.1000   -0.0001
##    480        0.0210             nan     0.1000   -0.0001
##    500        0.0186             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3199             nan     0.0010    0.0003
##      2        1.3190             nan     0.0010    0.0004
##      3        1.3181             nan     0.0010    0.0004
##      4        1.3172             nan     0.0010    0.0004
##      5        1.3164             nan     0.0010    0.0004
##      6        1.3156             nan     0.0010    0.0003
##      7        1.3147             nan     0.0010    0.0004
##      8        1.3139             nan     0.0010    0.0004
##      9        1.3131             nan     0.0010    0.0004
##     10        1.3123             nan     0.0010    0.0004
##     20        1.3041             nan     0.0010    0.0004
##     40        1.2882             nan     0.0010    0.0004
##     60        1.2729             nan     0.0010    0.0004
##     80        1.2584             nan     0.0010    0.0003
##    100        1.2443             nan     0.0010    0.0003
##    120        1.2303             nan     0.0010    0.0003
##    140        1.2168             nan     0.0010    0.0003
##    160        1.2040             nan     0.0010    0.0003
##    180        1.1912             nan     0.0010    0.0003
##    200        1.1790             nan     0.0010    0.0002
##    220        1.1672             nan     0.0010    0.0003
##    240        1.1558             nan     0.0010    0.0003
##    260        1.1446             nan     0.0010    0.0002
##    280        1.1338             nan     0.0010    0.0002
##    300        1.1232             nan     0.0010    0.0002
##    320        1.1131             nan     0.0010    0.0002
##    340        1.1033             nan     0.0010    0.0002
##    360        1.0936             nan     0.0010    0.0002
##    380        1.0841             nan     0.0010    0.0002
##    400        1.0750             nan     0.0010    0.0002
##    420        1.0661             nan     0.0010    0.0002
##    440        1.0575             nan     0.0010    0.0002
##    460        1.0491             nan     0.0010    0.0001
##    480        1.0410             nan     0.0010    0.0001
##    500        1.0332             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0004
##      2        1.3190             nan     0.0010    0.0004
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3174             nan     0.0010    0.0004
##      5        1.3166             nan     0.0010    0.0004
##      6        1.3157             nan     0.0010    0.0004
##      7        1.3148             nan     0.0010    0.0004
##      8        1.3139             nan     0.0010    0.0003
##      9        1.3131             nan     0.0010    0.0004
##     10        1.3123             nan     0.0010    0.0004
##     20        1.3043             nan     0.0010    0.0004
##     40        1.2882             nan     0.0010    0.0004
##     60        1.2729             nan     0.0010    0.0003
##     80        1.2583             nan     0.0010    0.0003
##    100        1.2443             nan     0.0010    0.0003
##    120        1.2309             nan     0.0010    0.0002
##    140        1.2174             nan     0.0010    0.0003
##    160        1.2044             nan     0.0010    0.0003
##    180        1.1916             nan     0.0010    0.0003
##    200        1.1793             nan     0.0010    0.0003
##    220        1.1679             nan     0.0010    0.0002
##    240        1.1567             nan     0.0010    0.0002
##    260        1.1455             nan     0.0010    0.0002
##    280        1.1349             nan     0.0010    0.0002
##    300        1.1246             nan     0.0010    0.0002
##    320        1.1147             nan     0.0010    0.0002
##    340        1.1047             nan     0.0010    0.0002
##    360        1.0951             nan     0.0010    0.0002
##    380        1.0853             nan     0.0010    0.0002
##    400        1.0764             nan     0.0010    0.0002
##    420        1.0677             nan     0.0010    0.0002
##    440        1.0590             nan     0.0010    0.0002
##    460        1.0505             nan     0.0010    0.0002
##    480        1.0422             nan     0.0010    0.0002
##    500        1.0342             nan     0.0010    0.0002
## 
## 
## (The per-iteration gbm training log above repeats, with minor numeric
##  variation, for each cross-validation resample and tuning combination.
##  Across the runs shown, train deviance declines from about 1.32 at
##  iteration 1 to roughly 1.00-1.04 after 500 iterations at shrinkage
##  0.0010, and to roughly 0.46-0.57 at shrinkage 0.0100; the Improve
##  column shrinks toward zero as the fit converges, and ValidDeviance is
##  nan throughout because gbm held out no internal validation split.)
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2403             nan     0.1000    0.0394
##      2        1.1743             nan     0.1000    0.0312
##      3        1.1206             nan     0.1000    0.0246
##      4        1.0750             nan     0.1000    0.0193
##      5        1.0319             nan     0.1000    0.0182
##      6        0.9978             nan     0.1000    0.0139
##      7        0.9649             nan     0.1000    0.0126
##      8        0.9335             nan     0.1000    0.0115
##      9        0.9061             nan     0.1000    0.0103
##     10        0.8819             nan     0.1000    0.0089
##     20        0.7356             nan     0.1000    0.0033
##     40        0.6047             nan     0.1000   -0.0006
##     60        0.5215             nan     0.1000   -0.0015
##     80        0.4611             nan     0.1000   -0.0011
##    100        0.4097             nan     0.1000    0.0003
##    120        0.3647             nan     0.1000   -0.0004
##    140        0.3256             nan     0.1000   -0.0010
##    160        0.2962             nan     0.1000   -0.0015
##    180        0.2685             nan     0.1000   -0.0002
##    200        0.2436             nan     0.1000   -0.0002
##    220        0.2213             nan     0.1000   -0.0005
##    240        0.2030             nan     0.1000   -0.0006
##    260        0.1868             nan     0.1000   -0.0004
##    280        0.1725             nan     0.1000   -0.0002
##    300        0.1588             nan     0.1000   -0.0003
##    320        0.1458             nan     0.1000   -0.0002
##    340        0.1346             nan     0.1000   -0.0002
##    360        0.1227             nan     0.1000   -0.0001
##    380        0.1134             nan     0.1000   -0.0003
##    400        0.1043             nan     0.1000    0.0002
##    420        0.0963             nan     0.1000   -0.0002
##    440        0.0888             nan     0.1000   -0.0004
##    460        0.0820             nan     0.1000   -0.0000
##    480        0.0758             nan     0.1000   -0.0002
##    500        0.0701             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2509             nan     0.1000    0.0334
##      2        1.1864             nan     0.1000    0.0265
##      3        1.1247             nan     0.1000    0.0266
##      4        1.0757             nan     0.1000    0.0211
##      5        1.0304             nan     0.1000    0.0198
##      6        0.9920             nan     0.1000    0.0164
##      7        0.9586             nan     0.1000    0.0097
##      8        0.9312             nan     0.1000    0.0104
##      9        0.9080             nan     0.1000    0.0079
##     10        0.8838             nan     0.1000    0.0092
##     20        0.7409             nan     0.1000    0.0001
##     40        0.6120             nan     0.1000   -0.0013
##     60        0.5359             nan     0.1000   -0.0017
##     80        0.4693             nan     0.1000   -0.0009
##    100        0.4200             nan     0.1000   -0.0011
##    120        0.3749             nan     0.1000   -0.0011
##    140        0.3340             nan     0.1000   -0.0002
##    160        0.3011             nan     0.1000   -0.0004
##    180        0.2701             nan     0.1000   -0.0006
##    200        0.2468             nan     0.1000   -0.0002
##    220        0.2284             nan     0.1000   -0.0010
##    240        0.2086             nan     0.1000   -0.0007
##    260        0.1923             nan     0.1000   -0.0005
##    280        0.1757             nan     0.1000   -0.0003
##    300        0.1617             nan     0.1000   -0.0007
##    320        0.1493             nan     0.1000   -0.0006
##    340        0.1386             nan     0.1000   -0.0006
##    360        0.1261             nan     0.1000   -0.0005
##    380        0.1172             nan     0.1000   -0.0003
##    400        0.1083             nan     0.1000   -0.0002
##    420        0.1008             nan     0.1000   -0.0002
##    440        0.0931             nan     0.1000   -0.0003
##    460        0.0852             nan     0.1000   -0.0000
##    480        0.0788             nan     0.1000   -0.0003
##    500        0.0730             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2342             nan     0.1000    0.0371
##      2        1.1699             nan     0.1000    0.0306
##      3        1.1148             nan     0.1000    0.0245
##      4        1.0653             nan     0.1000    0.0239
##      5        1.0270             nan     0.1000    0.0159
##      6        0.9959             nan     0.1000    0.0146
##      7        0.9591             nan     0.1000    0.0152
##      8        0.9299             nan     0.1000    0.0109
##      9        0.9045             nan     0.1000    0.0115
##     10        0.8811             nan     0.1000    0.0076
##     20        0.7402             nan     0.1000    0.0020
##     40        0.6166             nan     0.1000    0.0006
##     60        0.5365             nan     0.1000   -0.0014
##     80        0.4775             nan     0.1000   -0.0010
##    100        0.4262             nan     0.1000   -0.0007
##    120        0.3816             nan     0.1000   -0.0012
##    140        0.3423             nan     0.1000   -0.0006
##    160        0.3105             nan     0.1000   -0.0002
##    180        0.2838             nan     0.1000   -0.0025
##    200        0.2586             nan     0.1000   -0.0018
##    220        0.2361             nan     0.1000   -0.0010
##    240        0.2146             nan     0.1000   -0.0004
##    260        0.1963             nan     0.1000   -0.0002
##    280        0.1811             nan     0.1000   -0.0005
##    300        0.1683             nan     0.1000   -0.0002
##    320        0.1544             nan     0.1000   -0.0002
##    340        0.1416             nan     0.1000   -0.0008
##    360        0.1319             nan     0.1000   -0.0004
##    380        0.1218             nan     0.1000   -0.0003
##    400        0.1137             nan     0.1000   -0.0004
##    420        0.1058             nan     0.1000   -0.0003
##    440        0.0988             nan     0.1000   -0.0002
##    460        0.0911             nan     0.1000   -0.0003
##    480        0.0853             nan     0.1000   -0.0005
##    500        0.0796             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2240             nan     0.1000    0.0416
##      2        1.1543             nan     0.1000    0.0325
##      3        1.0932             nan     0.1000    0.0246
##      4        1.0470             nan     0.1000    0.0190
##      5        1.0070             nan     0.1000    0.0153
##      6        0.9696             nan     0.1000    0.0177
##      7        0.9321             nan     0.1000    0.0131
##      8        0.9023             nan     0.1000    0.0113
##      9        0.8739             nan     0.1000    0.0121
##     10        0.8479             nan     0.1000    0.0099
##     20        0.6962             nan     0.1000    0.0026
##     40        0.5499             nan     0.1000    0.0006
##     60        0.4632             nan     0.1000   -0.0001
##     80        0.3970             nan     0.1000   -0.0007
##    100        0.3441             nan     0.1000   -0.0005
##    120        0.3019             nan     0.1000   -0.0006
##    140        0.2683             nan     0.1000   -0.0006
##    160        0.2363             nan     0.1000   -0.0014
##    180        0.2111             nan     0.1000   -0.0014
##    200        0.1883             nan     0.1000    0.0001
##    220        0.1645             nan     0.1000   -0.0004
##    240        0.1472             nan     0.1000   -0.0003
##    260        0.1319             nan     0.1000   -0.0002
##    280        0.1177             nan     0.1000   -0.0001
##    300        0.1064             nan     0.1000   -0.0005
##    320        0.0968             nan     0.1000   -0.0002
##    340        0.0870             nan     0.1000   -0.0003
##    360        0.0782             nan     0.1000   -0.0000
##    380        0.0711             nan     0.1000   -0.0002
##    400        0.0641             nan     0.1000   -0.0001
##    420        0.0588             nan     0.1000    0.0000
##    440        0.0529             nan     0.1000   -0.0000
##    460        0.0485             nan     0.1000   -0.0001
##    480        0.0436             nan     0.1000    0.0000
##    500        0.0395             nan     0.1000    0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2341             nan     0.1000    0.0387
##      2        1.1704             nan     0.1000    0.0278
##      3        1.1123             nan     0.1000    0.0270
##      4        1.0574             nan     0.1000    0.0219
##      5        1.0140             nan     0.1000    0.0174
##      6        0.9747             nan     0.1000    0.0181
##      7        0.9440             nan     0.1000    0.0106
##      8        0.9158             nan     0.1000    0.0102
##      9        0.8895             nan     0.1000    0.0076
##     10        0.8649             nan     0.1000    0.0092
##     20        0.7068             nan     0.1000    0.0024
##     40        0.5659             nan     0.1000   -0.0015
##     60        0.4757             nan     0.1000   -0.0008
##     80        0.4002             nan     0.1000   -0.0015
##    100        0.3511             nan     0.1000   -0.0009
##    120        0.3024             nan     0.1000   -0.0004
##    140        0.2657             nan     0.1000   -0.0010
##    160        0.2384             nan     0.1000   -0.0005
##    180        0.2101             nan     0.1000   -0.0011
##    200        0.1884             nan     0.1000   -0.0007
##    220        0.1676             nan     0.1000   -0.0006
##    240        0.1497             nan     0.1000   -0.0010
##    260        0.1351             nan     0.1000   -0.0006
##    280        0.1207             nan     0.1000   -0.0003
##    300        0.1095             nan     0.1000   -0.0007
##    320        0.0987             nan     0.1000   -0.0003
##    340        0.0892             nan     0.1000   -0.0005
##    360        0.0813             nan     0.1000   -0.0002
##    380        0.0734             nan     0.1000   -0.0004
##    400        0.0671             nan     0.1000   -0.0002
##    420        0.0614             nan     0.1000   -0.0002
##    440        0.0557             nan     0.1000   -0.0003
##    460        0.0504             nan     0.1000   -0.0002
##    480        0.0459             nan     0.1000   -0.0001
##    500        0.0412             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2370             nan     0.1000    0.0362
##      2        1.1707             nan     0.1000    0.0342
##      3        1.1115             nan     0.1000    0.0264
##      4        1.0620             nan     0.1000    0.0233
##      5        1.0164             nan     0.1000    0.0217
##      6        0.9767             nan     0.1000    0.0165
##      7        0.9401             nan     0.1000    0.0141
##      8        0.9081             nan     0.1000    0.0113
##      9        0.8805             nan     0.1000    0.0121
##     10        0.8580             nan     0.1000    0.0084
##     20        0.7049             nan     0.1000    0.0035
##     40        0.5691             nan     0.1000    0.0000
##     60        0.4878             nan     0.1000   -0.0007
##     80        0.4201             nan     0.1000    0.0001
##    100        0.3689             nan     0.1000   -0.0017
##    120        0.3226             nan     0.1000   -0.0012
##    140        0.2825             nan     0.1000   -0.0004
##    160        0.2475             nan     0.1000   -0.0005
##    180        0.2191             nan     0.1000   -0.0003
##    200        0.1944             nan     0.1000   -0.0007
##    220        0.1735             nan     0.1000   -0.0002
##    240        0.1563             nan     0.1000   -0.0009
##    260        0.1401             nan     0.1000   -0.0006
##    280        0.1277             nan     0.1000   -0.0001
##    300        0.1149             nan     0.1000   -0.0004
##    320        0.1047             nan     0.1000   -0.0003
##    340        0.0944             nan     0.1000   -0.0004
##    360        0.0852             nan     0.1000   -0.0002
##    380        0.0772             nan     0.1000   -0.0003
##    400        0.0697             nan     0.1000   -0.0003
##    420        0.0637             nan     0.1000   -0.0003
##    440        0.0584             nan     0.1000   -0.0003
##    460        0.0535             nan     0.1000   -0.0002
##    480        0.0486             nan     0.1000   -0.0001
##    500        0.0443             nan     0.1000   -0.0003
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2295             nan     0.1000    0.0426
##      2        1.1476             nan     0.1000    0.0343
##      3        1.0883             nan     0.1000    0.0260
##      4        1.0358             nan     0.1000    0.0209
##      5        0.9949             nan     0.1000    0.0174
##      6        0.9568             nan     0.1000    0.0160
##      7        0.9169             nan     0.1000    0.0151
##      8        0.8862             nan     0.1000    0.0135
##      9        0.8568             nan     0.1000    0.0125
##     10        0.8307             nan     0.1000    0.0074
##     20        0.6742             nan     0.1000    0.0003
##     40        0.5148             nan     0.1000   -0.0013
##     60        0.4146             nan     0.1000    0.0005
##     80        0.3489             nan     0.1000   -0.0006
##    100        0.2939             nan     0.1000   -0.0013
##    120        0.2480             nan     0.1000   -0.0012
##    140        0.2121             nan     0.1000   -0.0001
##    160        0.1834             nan     0.1000   -0.0001
##    180        0.1580             nan     0.1000   -0.0004
##    200        0.1371             nan     0.1000   -0.0001
##    220        0.1170             nan     0.1000   -0.0002
##    240        0.1032             nan     0.1000   -0.0001
##    260        0.0917             nan     0.1000   -0.0003
##    280        0.0816             nan     0.1000    0.0000
##    300        0.0726             nan     0.1000   -0.0002
##    320        0.0637             nan     0.1000   -0.0001
##    340        0.0560             nan     0.1000   -0.0002
##    360        0.0498             nan     0.1000   -0.0001
##    380        0.0441             nan     0.1000   -0.0000
##    400        0.0394             nan     0.1000   -0.0000
##    420        0.0353             nan     0.1000   -0.0000
##    440        0.0313             nan     0.1000   -0.0001
##    460        0.0278             nan     0.1000   -0.0001
##    480        0.0251             nan     0.1000    0.0000
##    500        0.0224             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2303             nan     0.1000    0.0356
##      2        1.1605             nan     0.1000    0.0299
##      3        1.0959             nan     0.1000    0.0302
##      4        1.0394             nan     0.1000    0.0224
##      5        0.9968             nan     0.1000    0.0187
##      6        0.9569             nan     0.1000    0.0177
##      7        0.9254             nan     0.1000    0.0114
##      8        0.8951             nan     0.1000    0.0112
##      9        0.8689             nan     0.1000    0.0094
##     10        0.8404             nan     0.1000    0.0106
##     20        0.6839             nan     0.1000    0.0009
##     40        0.5214             nan     0.1000    0.0004
##     60        0.4227             nan     0.1000   -0.0018
##     80        0.3522             nan     0.1000   -0.0008
##    100        0.3014             nan     0.1000    0.0001
##    120        0.2602             nan     0.1000   -0.0004
##    140        0.2163             nan     0.1000   -0.0006
##    160        0.1862             nan     0.1000   -0.0009
##    180        0.1606             nan     0.1000    0.0000
##    200        0.1407             nan     0.1000   -0.0008
##    220        0.1222             nan     0.1000   -0.0002
##    240        0.1068             nan     0.1000   -0.0001
##    260        0.0941             nan     0.1000   -0.0005
##    280        0.0829             nan     0.1000   -0.0001
##    300        0.0730             nan     0.1000   -0.0002
##    320        0.0631             nan     0.1000   -0.0002
##    340        0.0563             nan     0.1000   -0.0002
##    360        0.0500             nan     0.1000   -0.0002
##    380        0.0442             nan     0.1000   -0.0002
##    400        0.0398             nan     0.1000   -0.0001
##    420        0.0353             nan     0.1000   -0.0002
##    440        0.0312             nan     0.1000   -0.0001
##    460        0.0279             nan     0.1000   -0.0001
##    480        0.0252             nan     0.1000   -0.0001
##    500        0.0226             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2321             nan     0.1000    0.0411
##      2        1.1604             nan     0.1000    0.0315
##      3        1.1016             nan     0.1000    0.0271
##      4        1.0512             nan     0.1000    0.0221
##      5        1.0049             nan     0.1000    0.0191
##      6        0.9673             nan     0.1000    0.0161
##      7        0.9309             nan     0.1000    0.0152
##      8        0.9066             nan     0.1000    0.0077
##      9        0.8819             nan     0.1000    0.0096
##     10        0.8556             nan     0.1000    0.0104
##     20        0.6852             nan     0.1000    0.0013
##     40        0.5429             nan     0.1000    0.0005
##     60        0.4519             nan     0.1000   -0.0017
##     80        0.3700             nan     0.1000   -0.0013
##    100        0.3108             nan     0.1000   -0.0003
##    120        0.2650             nan     0.1000   -0.0012
##    140        0.2295             nan     0.1000   -0.0010
##    160        0.2020             nan     0.1000   -0.0009
##    180        0.1750             nan     0.1000   -0.0008
##    200        0.1515             nan     0.1000   -0.0000
##    220        0.1328             nan     0.1000   -0.0003
##    240        0.1171             nan     0.1000   -0.0005
##    260        0.1030             nan     0.1000   -0.0004
##    280        0.0904             nan     0.1000   -0.0005
##    300        0.0808             nan     0.1000   -0.0001
##    320        0.0725             nan     0.1000   -0.0003
##    340        0.0647             nan     0.1000   -0.0002
##    360        0.0580             nan     0.1000   -0.0003
##    380        0.0519             nan     0.1000   -0.0004
##    400        0.0456             nan     0.1000   -0.0001
##    420        0.0408             nan     0.1000   -0.0001
##    440        0.0364             nan     0.1000   -0.0003
##    460        0.0330             nan     0.1000   -0.0001
##    480        0.0299             nan     0.1000   -0.0000
##    500        0.0268             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3194             nan     0.0010    0.0004
##      3        1.3184             nan     0.0010    0.0004
##      4        1.3175             nan     0.0010    0.0004
##      5        1.3166             nan     0.0010    0.0004
##      6        1.3158             nan     0.0010    0.0004
##      7        1.3150             nan     0.0010    0.0004
##      8        1.3141             nan     0.0010    0.0004
##      9        1.3131             nan     0.0010    0.0004
##     10        1.3123             nan     0.0010    0.0004
##     20        1.3035             nan     0.0010    0.0004
##     40        1.2865             nan     0.0010    0.0004
##     60        1.2697             nan     0.0010    0.0004
##     80        1.2535             nan     0.0010    0.0004
##    100        1.2386             nan     0.0010    0.0003
##    120        1.2237             nan     0.0010    0.0003
##    140        1.2093             nan     0.0010    0.0003
##    160        1.1954             nan     0.0010    0.0003
##    180        1.1820             nan     0.0010    0.0003
##    200        1.1691             nan     0.0010    0.0003
##    220        1.1561             nan     0.0010    0.0003
##    240        1.1437             nan     0.0010    0.0003
##    260        1.1317             nan     0.0010    0.0003
##    280        1.1200             nan     0.0010    0.0002
##    300        1.1086             nan     0.0010    0.0003
##    320        1.0977             nan     0.0010    0.0002
##    340        1.0870             nan     0.0010    0.0002
##    360        1.0766             nan     0.0010    0.0002
##    380        1.0665             nan     0.0010    0.0002
##    400        1.0565             nan     0.0010    0.0002
##    420        1.0469             nan     0.0010    0.0002
##    440        1.0374             nan     0.0010    0.0002
##    460        1.0283             nan     0.0010    0.0002
##    480        1.0195             nan     0.0010    0.0002
##    500        1.0107             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3194             nan     0.0010    0.0004
##      3        1.3184             nan     0.0010    0.0004
##      4        1.3175             nan     0.0010    0.0004
##      5        1.3166             nan     0.0010    0.0004
##      6        1.3157             nan     0.0010    0.0004
##      7        1.3148             nan     0.0010    0.0004
##      8        1.3139             nan     0.0010    0.0004
##      9        1.3131             nan     0.0010    0.0004
##     10        1.3122             nan     0.0010    0.0003
##     20        1.3033             nan     0.0010    0.0004
##     40        1.2865             nan     0.0010    0.0004
##     60        1.2702             nan     0.0010    0.0003
##     80        1.2543             nan     0.0010    0.0004
##    100        1.2389             nan     0.0010    0.0003
##    120        1.2236             nan     0.0010    0.0003
##    140        1.2092             nan     0.0010    0.0003
##    160        1.1952             nan     0.0010    0.0003
##    180        1.1817             nan     0.0010    0.0003
##    200        1.1685             nan     0.0010    0.0003
##    220        1.1560             nan     0.0010    0.0003
##    240        1.1437             nan     0.0010    0.0003
##    260        1.1319             nan     0.0010    0.0002
##    280        1.1200             nan     0.0010    0.0003
##    300        1.1087             nan     0.0010    0.0003
##    320        1.0976             nan     0.0010    0.0002
##    340        1.0868             nan     0.0010    0.0002
##    360        1.0766             nan     0.0010    0.0002
##    380        1.0664             nan     0.0010    0.0002
##    400        1.0565             nan     0.0010    0.0002
##    420        1.0469             nan     0.0010    0.0002
##    440        1.0377             nan     0.0010    0.0002
##    460        1.0288             nan     0.0010    0.0002
##    480        1.0198             nan     0.0010    0.0002
##    500        1.0110             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0004
##      2        1.3194             nan     0.0010    0.0004
##      3        1.3185             nan     0.0010    0.0004
##      4        1.3176             nan     0.0010    0.0004
##      5        1.3168             nan     0.0010    0.0004
##      6        1.3158             nan     0.0010    0.0004
##      7        1.3150             nan     0.0010    0.0004
##      8        1.3140             nan     0.0010    0.0004
##      9        1.3132             nan     0.0010    0.0004
##     10        1.3124             nan     0.0010    0.0004
##     20        1.3038             nan     0.0010    0.0004
##     40        1.2871             nan     0.0010    0.0004
##     60        1.2710             nan     0.0010    0.0004
##     80        1.2554             nan     0.0010    0.0003
##    100        1.2403             nan     0.0010    0.0003
##    120        1.2259             nan     0.0010    0.0003
##    140        1.2117             nan     0.0010    0.0004
##    160        1.1977             nan     0.0010    0.0003
##    180        1.1845             nan     0.0010    0.0003
##    200        1.1714             nan     0.0010    0.0003
##    220        1.1585             nan     0.0010    0.0003
##    240        1.1461             nan     0.0010    0.0003
##    260        1.1342             nan     0.0010    0.0002
##    280        1.1228             nan     0.0010    0.0003
##    300        1.1115             nan     0.0010    0.0002
##    320        1.1006             nan     0.0010    0.0002
##    340        1.0901             nan     0.0010    0.0002
##    360        1.0798             nan     0.0010    0.0002
##    380        1.0697             nan     0.0010    0.0002
##    400        1.0597             nan     0.0010    0.0002
##    420        1.0501             nan     0.0010    0.0002
##    440        1.0409             nan     0.0010    0.0002
##    460        1.0320             nan     0.0010    0.0002
##    480        1.0230             nan     0.0010    0.0002
##    500        1.0143             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3184             nan     0.0010    0.0005
##      4        1.3174             nan     0.0010    0.0004
##      5        1.3165             nan     0.0010    0.0004
##      6        1.3156             nan     0.0010    0.0004
##      7        1.3147             nan     0.0010    0.0004
##      8        1.3138             nan     0.0010    0.0004
##      9        1.3129             nan     0.0010    0.0004
##     10        1.3120             nan     0.0010    0.0004
##     20        1.3028             nan     0.0010    0.0004
##     40        1.2851             nan     0.0010    0.0004
##     60        1.2677             nan     0.0010    0.0004
##     80        1.2512             nan     0.0010    0.0004
##    100        1.2348             nan     0.0010    0.0004
##    120        1.2188             nan     0.0010    0.0003
##    140        1.2034             nan     0.0010    0.0004
##    160        1.1886             nan     0.0010    0.0003
##    180        1.1747             nan     0.0010    0.0003
##    200        1.1609             nan     0.0010    0.0003
##    220        1.1473             nan     0.0010    0.0003
##    240        1.1344             nan     0.0010    0.0003
##    260        1.1220             nan     0.0010    0.0003
##    280        1.1096             nan     0.0010    0.0003
##    300        1.0977             nan     0.0010    0.0003
##    320        1.0861             nan     0.0010    0.0003
##    340        1.0746             nan     0.0010    0.0002
##    360        1.0635             nan     0.0010    0.0003
##    380        1.0528             nan     0.0010    0.0002
##    400        1.0423             nan     0.0010    0.0002
##    420        1.0322             nan     0.0010    0.0002
##    440        1.0224             nan     0.0010    0.0002
##    460        1.0126             nan     0.0010    0.0002
##    480        1.0033             nan     0.0010    0.0002
##    500        0.9941             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0004
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3184             nan     0.0010    0.0004
##      4        1.3175             nan     0.0010    0.0004
##      5        1.3165             nan     0.0010    0.0004
##      6        1.3156             nan     0.0010    0.0004
##      7        1.3147             nan     0.0010    0.0004
##      8        1.3138             nan     0.0010    0.0004
##      9        1.3129             nan     0.0010    0.0004
##     10        1.3120             nan     0.0010    0.0004
##     20        1.3028             nan     0.0010    0.0004
##     40        1.2848             nan     0.0010    0.0004
##     60        1.2674             nan     0.0010    0.0004
##     80        1.2508             nan     0.0010    0.0004
##    100        1.2347             nan     0.0010    0.0003
##    120        1.2190             nan     0.0010    0.0003
##    140        1.2038             nan     0.0010    0.0003
##    160        1.1890             nan     0.0010    0.0003
##    180        1.1747             nan     0.0010    0.0003
##    200        1.1609             nan     0.0010    0.0003
##    220        1.1473             nan     0.0010    0.0003
##    240        1.1344             nan     0.0010    0.0003
##    260        1.1221             nan     0.0010    0.0002
##    280        1.1099             nan     0.0010    0.0003
##    300        1.0979             nan     0.0010    0.0002
##    320        1.0862             nan     0.0010    0.0002
##    340        1.0746             nan     0.0010    0.0002
##    360        1.0635             nan     0.0010    0.0002
##    380        1.0530             nan     0.0010    0.0002
##    400        1.0426             nan     0.0010    0.0002
##    420        1.0325             nan     0.0010    0.0002
##    440        1.0228             nan     0.0010    0.0002
##    460        1.0133             nan     0.0010    0.0002
##    480        1.0039             nan     0.0010    0.0002
##    500        0.9948             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0004
##      2        1.3194             nan     0.0010    0.0004
##      3        1.3184             nan     0.0010    0.0004
##      4        1.3175             nan     0.0010    0.0004
##      5        1.3166             nan     0.0010    0.0004
##      6        1.3156             nan     0.0010    0.0004
##      7        1.3147             nan     0.0010    0.0004
##      8        1.3138             nan     0.0010    0.0004
##      9        1.3129             nan     0.0010    0.0004
##     10        1.3120             nan     0.0010    0.0004
##     20        1.3028             nan     0.0010    0.0004
##     40        1.2852             nan     0.0010    0.0004
##     60        1.2680             nan     0.0010    0.0003
##     80        1.2515             nan     0.0010    0.0003
##    100        1.2355             nan     0.0010    0.0004
##    120        1.2201             nan     0.0010    0.0004
##    140        1.2049             nan     0.0010    0.0003
##    160        1.1904             nan     0.0010    0.0003
##    180        1.1764             nan     0.0010    0.0003
##    200        1.1628             nan     0.0010    0.0003
##    220        1.1493             nan     0.0010    0.0003
##    240        1.1364             nan     0.0010    0.0003
##    260        1.1241             nan     0.0010    0.0003
##    280        1.1119             nan     0.0010    0.0003
##    300        1.1002             nan     0.0010    0.0003
##    320        1.0889             nan     0.0010    0.0003
##    340        1.0779             nan     0.0010    0.0002
##    360        1.0669             nan     0.0010    0.0002
##    380        1.0565             nan     0.0010    0.0003
##    400        1.0460             nan     0.0010    0.0002
##    420        1.0360             nan     0.0010    0.0002
##    440        1.0260             nan     0.0010    0.0002
##    460        1.0162             nan     0.0010    0.0002
##    480        1.0072             nan     0.0010    0.0002
##    500        0.9980             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0005
##      2        1.3192             nan     0.0010    0.0004
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0005
##      6        1.3153             nan     0.0010    0.0004
##      7        1.3144             nan     0.0010    0.0004
##      8        1.3134             nan     0.0010    0.0004
##      9        1.3124             nan     0.0010    0.0004
##     10        1.3115             nan     0.0010    0.0004
##     20        1.3020             nan     0.0010    0.0005
##     40        1.2830             nan     0.0010    0.0004
##     60        1.2646             nan     0.0010    0.0005
##     80        1.2467             nan     0.0010    0.0004
##    100        1.2300             nan     0.0010    0.0003
##    120        1.2134             nan     0.0010    0.0003
##    140        1.1974             nan     0.0010    0.0003
##    160        1.1820             nan     0.0010    0.0003
##    180        1.1672             nan     0.0010    0.0004
##    200        1.1528             nan     0.0010    0.0003
##    220        1.1389             nan     0.0010    0.0003
##    240        1.1251             nan     0.0010    0.0003
##    260        1.1119             nan     0.0010    0.0003
##    280        1.0992             nan     0.0010    0.0003
##    300        1.0869             nan     0.0010    0.0003
##    320        1.0746             nan     0.0010    0.0002
##    340        1.0630             nan     0.0010    0.0002
##    360        1.0517             nan     0.0010    0.0003
##    380        1.0407             nan     0.0010    0.0002
##    400        1.0296             nan     0.0010    0.0002
##    420        1.0193             nan     0.0010    0.0002
##    440        1.0090             nan     0.0010    0.0002
##    460        0.9989             nan     0.0010    0.0002
##    480        0.9890             nan     0.0010    0.0002
##    500        0.9799             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0004
##      2        1.3192             nan     0.0010    0.0004
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0004
##      6        1.3154             nan     0.0010    0.0004
##      7        1.3145             nan     0.0010    0.0005
##      8        1.3135             nan     0.0010    0.0005
##      9        1.3125             nan     0.0010    0.0004
##     10        1.3116             nan     0.0010    0.0004
##     20        1.3018             nan     0.0010    0.0005
##     40        1.2832             nan     0.0010    0.0004
##     60        1.2652             nan     0.0010    0.0004
##     80        1.2477             nan     0.0010    0.0004
##    100        1.2310             nan     0.0010    0.0004
##    120        1.2147             nan     0.0010    0.0004
##    140        1.1990             nan     0.0010    0.0004
##    160        1.1836             nan     0.0010    0.0004
##    180        1.1687             nan     0.0010    0.0003
##    200        1.1542             nan     0.0010    0.0003
##    220        1.1405             nan     0.0010    0.0003
##    240        1.1272             nan     0.0010    0.0003
##    260        1.1140             nan     0.0010    0.0003
##    280        1.1012             nan     0.0010    0.0003
##    300        1.0887             nan     0.0010    0.0003
##    320        1.0767             nan     0.0010    0.0003
##    340        1.0651             nan     0.0010    0.0002
##    360        1.0539             nan     0.0010    0.0002
##    380        1.0429             nan     0.0010    0.0002
##    400        1.0320             nan     0.0010    0.0002
##    420        1.0216             nan     0.0010    0.0002
##    440        1.0112             nan     0.0010    0.0002
##    460        1.0011             nan     0.0010    0.0002
##    480        0.9915             nan     0.0010    0.0002
##    500        0.9820             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0004
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3184             nan     0.0010    0.0004
##      4        1.3175             nan     0.0010    0.0004
##      5        1.3165             nan     0.0010    0.0005
##      6        1.3155             nan     0.0010    0.0005
##      7        1.3144             nan     0.0010    0.0004
##      8        1.3134             nan     0.0010    0.0005
##      9        1.3124             nan     0.0010    0.0004
##     10        1.3115             nan     0.0010    0.0005
##     20        1.3019             nan     0.0010    0.0004
##     40        1.2838             nan     0.0010    0.0004
##     60        1.2659             nan     0.0010    0.0004
##     80        1.2485             nan     0.0010    0.0004
##    100        1.2318             nan     0.0010    0.0004
##    120        1.2156             nan     0.0010    0.0004
##    140        1.2001             nan     0.0010    0.0003
##    160        1.1852             nan     0.0010    0.0003
##    180        1.1709             nan     0.0010    0.0003
##    200        1.1567             nan     0.0010    0.0003
##    220        1.1430             nan     0.0010    0.0003
##    240        1.1299             nan     0.0010    0.0003
##    260        1.1172             nan     0.0010    0.0003
##    280        1.1047             nan     0.0010    0.0002
##    300        1.0925             nan     0.0010    0.0003
##    320        1.0806             nan     0.0010    0.0003
##    340        1.0691             nan     0.0010    0.0003
##    360        1.0580             nan     0.0010    0.0003
##    380        1.0471             nan     0.0010    0.0002
##    400        1.0364             nan     0.0010    0.0002
##    420        1.0261             nan     0.0010    0.0002
##    440        1.0161             nan     0.0010    0.0002
##    460        1.0064             nan     0.0010    0.0002
##    480        0.9968             nan     0.0010    0.0002
##    500        0.9877             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3118             nan     0.0100    0.0044
##      2        1.3035             nan     0.0100    0.0036
##      3        1.2947             nan     0.0100    0.0041
##      4        1.2864             nan     0.0100    0.0040
##      5        1.2779             nan     0.0100    0.0040
##      6        1.2694             nan     0.0100    0.0039
##      7        1.2612             nan     0.0100    0.0036
##      8        1.2535             nan     0.0100    0.0034
##      9        1.2462             nan     0.0100    0.0033
##     10        1.2387             nan     0.0100    0.0034
##     20        1.1676             nan     0.0100    0.0026
##     40        1.0537             nan     0.0100    0.0023
##     60        0.9691             nan     0.0100    0.0014
##     80        0.9022             nan     0.0100    0.0013
##    100        0.8471             nan     0.0100    0.0010
##    120        0.8030             nan     0.0100    0.0008
##    140        0.7665             nan     0.0100    0.0005
##    160        0.7357             nan     0.0100    0.0004
##    180        0.7087             nan     0.0100    0.0004
##    200        0.6849             nan     0.0100    0.0002
##    220        0.6633             nan     0.0100    0.0001
##    240        0.6445             nan     0.0100    0.0001
##    260        0.6290             nan     0.0100    0.0003
##    280        0.6139             nan     0.0100    0.0001
##    300        0.5991             nan     0.0100    0.0002
##    320        0.5866             nan     0.0100    0.0000
##    340        0.5747             nan     0.0100    0.0001
##    360        0.5637             nan     0.0100    0.0001
##    380        0.5525             nan     0.0100   -0.0000
##    400        0.5420             nan     0.0100    0.0001
##    420        0.5324             nan     0.0100    0.0001
##    440        0.5232             nan     0.0100   -0.0000
##    460        0.5146             nan     0.0100    0.0000
##    480        0.5059             nan     0.0100   -0.0001
##    500        0.4972             nan     0.0100   -0.0000
## 
## [... eight additional fold traces for shrinkage = 0.010 truncated: 500 iterations each, with TrainDeviance declining from ~1.31 to ~0.41-0.52 and the Improve estimate turning slightly negative in later iterations ...]
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2314             nan     0.1000    0.0401
##      2        1.1605             nan     0.1000    0.0323
##      3        1.1031             nan     0.1000    0.0242
##      4        1.0488             nan     0.1000    0.0252
##      5        1.0018             nan     0.1000    0.0196
##      6        0.9617             nan     0.1000    0.0186
##      7        0.9263             nan     0.1000    0.0135
##      8        0.8936             nan     0.1000    0.0139
##      9        0.8667             nan     0.1000    0.0114
##     10        0.8392             nan     0.1000    0.0089
##     20        0.6774             nan     0.1000    0.0022
##     40        0.5462             nan     0.1000    0.0003
##     60        0.4580             nan     0.1000   -0.0003
##     80        0.3982             nan     0.1000   -0.0000
##    100        0.3466             nan     0.1000   -0.0003
##    120        0.3086             nan     0.1000   -0.0009
##    140        0.2725             nan     0.1000   -0.0003
##    160        0.2432             nan     0.1000   -0.0003
##    180        0.2205             nan     0.1000    0.0002
##    200        0.1967             nan     0.1000   -0.0001
##    220        0.1791             nan     0.1000   -0.0004
##    240        0.1616             nan     0.1000   -0.0001
##    260        0.1459             nan     0.1000   -0.0002
##    280        0.1338             nan     0.1000   -0.0004
##    300        0.1208             nan     0.1000   -0.0002
##    320        0.1120             nan     0.1000   -0.0002
##    340        0.1023             nan     0.1000   -0.0000
##    360        0.0937             nan     0.1000   -0.0003
##    380        0.0851             nan     0.1000   -0.0002
##    400        0.0779             nan     0.1000   -0.0002
##    420        0.0720             nan     0.1000   -0.0003
##    440        0.0660             nan     0.1000   -0.0000
##    460        0.0604             nan     0.1000   -0.0003
##    480        0.0557             nan     0.1000   -0.0001
##    500        0.0506             nan     0.1000   -0.0000
## 
## [... one additional fold trace for shrinkage = 0.100 truncated: TrainDeviance declines from ~1.24 to ~0.05 over 500 iterations, with Improve mostly negative beyond ~60 iterations, suggesting the ensemble begins to overfit early at this learning rate ...]
## 
## (... seven additional boosting traces at shrinkage 0.1000 omitted; each
## reports the same Iter / TrainDeviance / ValidDeviance / StepSize / Improve
## columns and runs to iteration 500 ...)
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3198             nan     0.0010    0.0003
##      2        1.3190             nan     0.0010    0.0004
##      3        1.3182             nan     0.0010    0.0004
##      4        1.3174             nan     0.0010    0.0003
##      5        1.3166             nan     0.0010    0.0004
##      6        1.3158             nan     0.0010    0.0004
##      7        1.3150             nan     0.0010    0.0004
##      8        1.3142             nan     0.0010    0.0003
##      9        1.3134             nan     0.0010    0.0003
##     10        1.3126             nan     0.0010    0.0004
##     20        1.3046             nan     0.0010    0.0003
##     40        1.2889             nan     0.0010    0.0003
##     60        1.2735             nan     0.0010    0.0003
##     80        1.2586             nan     0.0010    0.0003
##    100        1.2443             nan     0.0010    0.0003
##    120        1.2305             nan     0.0010    0.0003
##    140        1.2171             nan     0.0010    0.0003
##    160        1.2038             nan     0.0010    0.0003
##    180        1.1910             nan     0.0010    0.0003
##    200        1.1786             nan     0.0010    0.0002
##    220        1.1665             nan     0.0010    0.0002
##    240        1.1549             nan     0.0010    0.0002
##    260        1.1439             nan     0.0010    0.0002
##    280        1.1329             nan     0.0010    0.0002
##    300        1.1222             nan     0.0010    0.0002
##    320        1.1120             nan     0.0010    0.0002
##    340        1.1019             nan     0.0010    0.0002
##    360        1.0920             nan     0.0010    0.0002
##    380        1.0826             nan     0.0010    0.0002
##    400        1.0734             nan     0.0010    0.0002
##    420        1.0643             nan     0.0010    0.0002
##    440        1.0556             nan     0.0010    0.0002
##    460        1.0470             nan     0.0010    0.0002
##    480        1.0387             nan     0.0010    0.0002
##    500        1.0304             nan     0.0010    0.0002
## 
## (... six additional boosting traces at shrinkage 0.0010 omitted; same
## column layout, each run to iteration 500 ...)
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3187             nan     0.0010    0.0004
##      3        1.3178             nan     0.0010    0.0004
##      4        1.3169             nan     0.0010    0.0004
##      5        1.3160             nan     0.0010    0.0004
##      6        1.3150             nan     0.0010    0.0005
##      7        1.3140             nan     0.0010    0.0004
##      8        1.3132             nan     0.0010    0.0004
##      9        1.3121             nan     0.0010    0.0004
##     10        1.3112             nan     0.0010    0.0004
##     20        1.3020             nan     0.0010    0.0004
##     40        1.2839             nan     0.0010    0.0004
##     60        1.2669             nan     0.0010    0.0003
##     80        1.2503             nan     0.0010    0.0004
##    100        1.2343             nan     0.0010    0.0004
##    120        1.2189             nan     0.0010    0.0003
##    140        1.2037             nan     0.0010    0.0003
##    160        1.1891             nan     0.0010    0.0003
##    180        1.1752             nan     0.0010    0.0003
##    200        1.1618             nan     0.0010    0.0003
##    220        1.1486             nan     0.0010    0.0003
##    240        1.1355             nan     0.0010    0.0003
##    260        1.1231             nan     0.0010    0.0003
##    280        1.1112             nan     0.0010    0.0002
##    300        1.0994             nan     0.0010    0.0003
##    320        1.0882             nan     0.0010    0.0002
##    340        1.0771             nan     0.0010    0.0002
##    360        1.0663             nan     0.0010    0.0002
##    380        1.0558             nan     0.0010    0.0002
##    400        1.0454             nan     0.0010    0.0002
##    420        1.0353             nan     0.0010    0.0002
##    440        1.0256             nan     0.0010    0.0002
##    460        1.0166             nan     0.0010    0.0002
##    480        1.0076             nan     0.0010    0.0002
##    500        0.9987             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3197             nan     0.0010    0.0004
##      2        1.3187             nan     0.0010    0.0004
##      3        1.3177             nan     0.0010    0.0004
##      4        1.3168             nan     0.0010    0.0004
##      5        1.3159             nan     0.0010    0.0004
##      6        1.3149             nan     0.0010    0.0004
##      7        1.3139             nan     0.0010    0.0005
##      8        1.3131             nan     0.0010    0.0004
##      9        1.3121             nan     0.0010    0.0004
##     10        1.3113             nan     0.0010    0.0004
##     20        1.3023             nan     0.0010    0.0004
##     40        1.2854             nan     0.0010    0.0004
##     60        1.2683             nan     0.0010    0.0004
##     80        1.2525             nan     0.0010    0.0003
##    100        1.2368             nan     0.0010    0.0004
##    120        1.2216             nan     0.0010    0.0003
##    140        1.2069             nan     0.0010    0.0003
##    160        1.1923             nan     0.0010    0.0003
##    180        1.1783             nan     0.0010    0.0003
##    200        1.1647             nan     0.0010    0.0002
##    220        1.1515             nan     0.0010    0.0003
##    240        1.1389             nan     0.0010    0.0003
##    260        1.1265             nan     0.0010    0.0003
##    280        1.1145             nan     0.0010    0.0003
##    300        1.1029             nan     0.0010    0.0003
##    320        1.0917             nan     0.0010    0.0002
##    340        1.0806             nan     0.0010    0.0002
##    360        1.0701             nan     0.0010    0.0002
##    380        1.0596             nan     0.0010    0.0002
##    400        1.0494             nan     0.0010    0.0002
##    420        1.0399             nan     0.0010    0.0002
##    440        1.0305             nan     0.0010    0.0002
##    460        1.0214             nan     0.0010    0.0002
##    480        1.0124             nan     0.0010    0.0002
##    500        1.0037             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3129             nan     0.0100    0.0035
##      2        1.3037             nan     0.0100    0.0042
##      3        1.2967             nan     0.0100    0.0031
##      4        1.2886             nan     0.0100    0.0037
##      5        1.2806             nan     0.0100    0.0034
##      6        1.2728             nan     0.0100    0.0035
##      7        1.2650             nan     0.0100    0.0033
##      8        1.2580             nan     0.0100    0.0029
##      9        1.2508             nan     0.0100    0.0032
##     10        1.2432             nan     0.0100    0.0036
##     20        1.1779             nan     0.0100    0.0029
##     40        1.0731             nan     0.0100    0.0020
##     60        0.9909             nan     0.0100    0.0016
##     80        0.9278             nan     0.0100    0.0012
##    100        0.8764             nan     0.0100    0.0008
##    120        0.8344             nan     0.0100    0.0004
##    140        0.7997             nan     0.0100    0.0004
##    160        0.7703             nan     0.0100    0.0000
##    180        0.7457             nan     0.0100    0.0003
##    200        0.7248             nan     0.0100    0.0002
##    220        0.7062             nan     0.0100    0.0000
##    240        0.6885             nan     0.0100    0.0001
##    260        0.6737             nan     0.0100    0.0001
##    280        0.6576             nan     0.0100    0.0001
##    300        0.6441             nan     0.0100    0.0001
##    320        0.6322             nan     0.0100    0.0000
##    340        0.6197             nan     0.0100    0.0001
##    360        0.6094             nan     0.0100    0.0000
##    380        0.5983             nan     0.0100    0.0000
##    400        0.5890             nan     0.0100   -0.0000
##    420        0.5796             nan     0.0100   -0.0001
##    440        0.5706             nan     0.0100   -0.0001
##    460        0.5618             nan     0.0100   -0.0001
##    480        0.5531             nan     0.0100   -0.0002
##    500        0.5451             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3124             nan     0.0100    0.0039
##      2        1.3041             nan     0.0100    0.0036
##      3        1.2960             nan     0.0100    0.0035
##      4        1.2879             nan     0.0100    0.0040
##      5        1.2799             nan     0.0100    0.0034
##      6        1.2729             nan     0.0100    0.0029
##      7        1.2653             nan     0.0100    0.0037
##      8        1.2576             nan     0.0100    0.0032
##      9        1.2501             nan     0.0100    0.0034
##     10        1.2428             nan     0.0100    0.0029
##     20        1.1771             nan     0.0100    0.0022
##     40        1.0737             nan     0.0100    0.0021
##     60        0.9929             nan     0.0100    0.0017
##     80        0.9297             nan     0.0100    0.0012
##    100        0.8790             nan     0.0100    0.0008
##    120        0.8368             nan     0.0100    0.0005
##    140        0.8034             nan     0.0100    0.0005
##    160        0.7749             nan     0.0100    0.0004
##    180        0.7491             nan     0.0100    0.0003
##    200        0.7280             nan     0.0100    0.0001
##    220        0.7097             nan     0.0100   -0.0000
##    240        0.6936             nan     0.0100   -0.0000
##    260        0.6784             nan     0.0100   -0.0000
##    280        0.6657             nan     0.0100    0.0000
##    300        0.6530             nan     0.0100    0.0002
##    320        0.6405             nan     0.0100    0.0001
##    340        0.6298             nan     0.0100   -0.0001
##    360        0.6193             nan     0.0100    0.0001
##    380        0.6095             nan     0.0100   -0.0000
##    400        0.5999             nan     0.0100   -0.0000
##    420        0.5904             nan     0.0100   -0.0001
##    440        0.5810             nan     0.0100    0.0001
##    460        0.5719             nan     0.0100   -0.0001
##    480        0.5631             nan     0.0100   -0.0001
##    500        0.5547             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3124             nan     0.0100    0.0039
##      2        1.3047             nan     0.0100    0.0035
##      3        1.2967             nan     0.0100    0.0037
##      4        1.2885             nan     0.0100    0.0038
##      5        1.2806             nan     0.0100    0.0039
##      6        1.2743             nan     0.0100    0.0031
##      7        1.2668             nan     0.0100    0.0030
##      8        1.2599             nan     0.0100    0.0032
##      9        1.2527             nan     0.0100    0.0032
##     10        1.2454             nan     0.0100    0.0031
##     20        1.1801             nan     0.0100    0.0029
##     40        1.0771             nan     0.0100    0.0021
##     60        0.9967             nan     0.0100    0.0015
##     80        0.9323             nan     0.0100    0.0011
##    100        0.8795             nan     0.0100    0.0008
##    120        0.8377             nan     0.0100    0.0005
##    140        0.8016             nan     0.0100    0.0004
##    160        0.7725             nan     0.0100    0.0005
##    180        0.7481             nan     0.0100    0.0003
##    200        0.7272             nan     0.0100    0.0002
##    220        0.7086             nan     0.0100    0.0001
##    240        0.6933             nan     0.0100    0.0001
##    260        0.6787             nan     0.0100    0.0000
##    280        0.6655             nan     0.0100   -0.0001
##    300        0.6541             nan     0.0100    0.0002
##    320        0.6415             nan     0.0100    0.0001
##    340        0.6311             nan     0.0100    0.0000
##    360        0.6213             nan     0.0100    0.0001
##    380        0.6105             nan     0.0100   -0.0001
##    400        0.6016             nan     0.0100    0.0000
##    420        0.5915             nan     0.0100   -0.0000
##    440        0.5824             nan     0.0100   -0.0001
##    460        0.5738             nan     0.0100   -0.0000
##    480        0.5664             nan     0.0100   -0.0000
##    500        0.5579             nan     0.0100   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3117             nan     0.0100    0.0040
##      2        1.3034             nan     0.0100    0.0041
##      3        1.2949             nan     0.0100    0.0040
##      4        1.2863             nan     0.0100    0.0039
##      5        1.2780             nan     0.0100    0.0037
##      6        1.2702             nan     0.0100    0.0036
##      7        1.2627             nan     0.0100    0.0034
##      8        1.2549             nan     0.0100    0.0038
##      9        1.2473             nan     0.0100    0.0033
##     10        1.2396             nan     0.0100    0.0038
##     20        1.1698             nan     0.0100    0.0033
##     40        1.0584             nan     0.0100    0.0020
##     60        0.9733             nan     0.0100    0.0016
##     80        0.9061             nan     0.0100    0.0012
##    100        0.8521             nan     0.0100    0.0011
##    120        0.8075             nan     0.0100    0.0006
##    140        0.7708             nan     0.0100    0.0006
##    160        0.7394             nan     0.0100    0.0004
##    180        0.7132             nan     0.0100    0.0004
##    200        0.6910             nan     0.0100    0.0003
##    220        0.6700             nan     0.0100    0.0001
##    240        0.6509             nan     0.0100    0.0001
##    260        0.6335             nan     0.0100    0.0002
##    280        0.6182             nan     0.0100    0.0002
##    300        0.6026             nan     0.0100    0.0000
##    320        0.5886             nan     0.0100   -0.0001
##    340        0.5763             nan     0.0100    0.0000
##    360        0.5641             nan     0.0100   -0.0002
##    380        0.5536             nan     0.0100   -0.0001
##    400        0.5428             nan     0.0100   -0.0002
##    420        0.5320             nan     0.0100    0.0001
##    440        0.5211             nan     0.0100    0.0002
##    460        0.5100             nan     0.0100   -0.0001
##    480        0.5004             nan     0.0100   -0.0001
##    500        0.4924             nan     0.0100    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3119             nan     0.0100    0.0038
##      2        1.3028             nan     0.0100    0.0041
##      3        1.2937             nan     0.0100    0.0040
##      4        1.2854             nan     0.0100    0.0037
##      5        1.2771             nan     0.0100    0.0032
##      6        1.2695             nan     0.0100    0.0035
##      7        1.2614             nan     0.0100    0.0036
##      8        1.2531             nan     0.0100    0.0038
##      9        1.2456             nan     0.0100    0.0037
##     10        1.2384             nan     0.0100    0.0028
##     20        1.1713             nan     0.0100    0.0028
##     40        1.0581             nan     0.0100    0.0022
##     60        0.9730             nan     0.0100    0.0015
##     80        0.9036             nan     0.0100    0.0011
##    100        0.8504             nan     0.0100    0.0007
##    120        0.8079             nan     0.0100    0.0006
##    140        0.7723             nan     0.0100    0.0005
##    160        0.7412             nan     0.0100    0.0005
##    180        0.7155             nan     0.0100    0.0005
##    200        0.6922             nan     0.0100    0.0003
##    220        0.6707             nan     0.0100    0.0001
##    240        0.6535             nan     0.0100    0.0002
##    260        0.6368             nan     0.0100    0.0001
##    280        0.6216             nan     0.0100    0.0002
##    300        0.6074             nan     0.0100   -0.0000
##    320        0.5943             nan     0.0100   -0.0002
##    340        0.5820             nan     0.0100    0.0001
##    360        0.5700             nan     0.0100   -0.0000
##    380        0.5595             nan     0.0100   -0.0001
##    400        0.5479             nan     0.0100    0.0000
##    420        0.5368             nan     0.0100   -0.0001
##    440        0.5264             nan     0.0100   -0.0001
##    460        0.5168             nan     0.0100   -0.0000
##    480        0.5068             nan     0.0100   -0.0001
##    500        0.4972             nan     0.0100   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3121             nan     0.0100    0.0035
##      2        1.3028             nan     0.0100    0.0042
##      3        1.2940             nan     0.0100    0.0042
##      4        1.2856             nan     0.0100    0.0038
##      5        1.2783             nan     0.0100    0.0031
##      6        1.2706             nan     0.0100    0.0035
##      7        1.2627             nan     0.0100    0.0036
##      8        1.2556             nan     0.0100    0.0034
##      9        1.2480             nan     0.0100    0.0033
##     10        1.2408             nan     0.0100    0.0026
##     20        1.1700             nan     0.0100    0.0032
##     40        1.0617             nan     0.0100    0.0021
##     60        0.9792             nan     0.0100    0.0011
##     80        0.9120             nan     0.0100    0.0011
##    100        0.8599             nan     0.0100    0.0008
##    120        0.8163             nan     0.0100    0.0009
##    140        0.7817             nan     0.0100    0.0004
##    160        0.7523             nan     0.0100    0.0003
##    180        0.7241             nan     0.0100    0.0005
##    200        0.7017             nan     0.0100    0.0002
##    220        0.6810             nan     0.0100    0.0001
##    240        0.6617             nan     0.0100    0.0002
##    260        0.6461             nan     0.0100   -0.0001
##    280        0.6321             nan     0.0100    0.0001
##    300        0.6178             nan     0.0100    0.0002
##    320        0.6062             nan     0.0100    0.0000
##    340        0.5932             nan     0.0100    0.0001
##    360        0.5807             nan     0.0100    0.0001
##    380        0.5697             nan     0.0100   -0.0001
##    400        0.5585             nan     0.0100   -0.0002
##    420        0.5473             nan     0.0100   -0.0001
##    440        0.5376             nan     0.0100    0.0000
##    460        0.5281             nan     0.0100   -0.0000
##    480        0.5189             nan     0.0100   -0.0001
##    500        0.5100             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3112             nan     0.0100    0.0043
##      2        1.3025             nan     0.0100    0.0039
##      3        1.2939             nan     0.0100    0.0040
##      4        1.2845             nan     0.0100    0.0041
##      5        1.2754             nan     0.0100    0.0041
##      6        1.2668             nan     0.0100    0.0038
##      7        1.2586             nan     0.0100    0.0031
##      8        1.2493             nan     0.0100    0.0042
##      9        1.2416             nan     0.0100    0.0036
##     10        1.2341             nan     0.0100    0.0036
##     20        1.1629             nan     0.0100    0.0029
##     40        1.0458             nan     0.0100    0.0020
##     60        0.9554             nan     0.0100    0.0014
##     80        0.8841             nan     0.0100    0.0011
##    100        0.8301             nan     0.0100    0.0008
##    120        0.7839             nan     0.0100    0.0006
##    140        0.7460             nan     0.0100    0.0004
##    160        0.7117             nan     0.0100    0.0005
##    180        0.6832             nan     0.0100    0.0003
##    200        0.6586             nan     0.0100    0.0004
##    220        0.6358             nan     0.0100    0.0001
##    240        0.6163             nan     0.0100    0.0003
##    260        0.5979             nan     0.0100   -0.0001
##    280        0.5813             nan     0.0100   -0.0000
##    300        0.5658             nan     0.0100    0.0001
##    320        0.5501             nan     0.0100    0.0001
##    340        0.5372             nan     0.0100   -0.0001
##    360        0.5238             nan     0.0100    0.0002
##    380        0.5116             nan     0.0100   -0.0001
##    400        0.4989             nan     0.0100    0.0000
##    420        0.4865             nan     0.0100    0.0000
##    440        0.4762             nan     0.0100   -0.0001
##    460        0.4659             nan     0.0100   -0.0000
##    480        0.4570             nan     0.0100   -0.0000
##    500        0.4472             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3113             nan     0.0100    0.0043
##      2        1.3018             nan     0.0100    0.0042
##      3        1.2935             nan     0.0100    0.0037
##      4        1.2844             nan     0.0100    0.0040
##      5        1.2753             nan     0.0100    0.0041
##      6        1.2667             nan     0.0100    0.0037
##      7        1.2588             nan     0.0100    0.0041
##      8        1.2510             nan     0.0100    0.0037
##      9        1.2423             nan     0.0100    0.0040
##     10        1.2339             nan     0.0100    0.0039
##     20        1.1605             nan     0.0100    0.0031
##     40        1.0445             nan     0.0100    0.0021
##     60        0.9549             nan     0.0100    0.0015
##     80        0.8857             nan     0.0100    0.0014
##    100        0.8286             nan     0.0100    0.0008
##    120        0.7838             nan     0.0100    0.0006
##    140        0.7463             nan     0.0100    0.0005
##    160        0.7140             nan     0.0100    0.0003
##    180        0.6853             nan     0.0100    0.0003
##    200        0.6618             nan     0.0100    0.0002
##    220        0.6400             nan     0.0100    0.0003
##    240        0.6200             nan     0.0100    0.0001
##    260        0.6029             nan     0.0100    0.0001
##    280        0.5868             nan     0.0100   -0.0001
##    300        0.5719             nan     0.0100    0.0001
##    320        0.5577             nan     0.0100    0.0001
##    340        0.5454             nan     0.0100    0.0000
##    360        0.5324             nan     0.0100   -0.0000
##    380        0.5198             nan     0.0100    0.0001
##    400        0.5078             nan     0.0100   -0.0000
##    420        0.4967             nan     0.0100    0.0001
##    440        0.4857             nan     0.0100   -0.0000
##    460        0.4770             nan     0.0100   -0.0000
##    480        0.4675             nan     0.0100    0.0000
##    500        0.4585             nan     0.0100   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3116             nan     0.0100    0.0045
##      2        1.3026             nan     0.0100    0.0038
##      3        1.2945             nan     0.0100    0.0038
##      4        1.2854             nan     0.0100    0.0041
##      5        1.2769             nan     0.0100    0.0038
##      6        1.2684             nan     0.0100    0.0038
##      7        1.2604             nan     0.0100    0.0035
##      8        1.2530             nan     0.0100    0.0034
##      9        1.2448             nan     0.0100    0.0038
##     10        1.2363             nan     0.0100    0.0039
##     20        1.1636             nan     0.0100    0.0032
##     40        1.0503             nan     0.0100    0.0019
##     60        0.9626             nan     0.0100    0.0010
##     80        0.8935             nan     0.0100    0.0014
##    100        0.8392             nan     0.0100    0.0009
##    120        0.7943             nan     0.0100    0.0006
##    140        0.7574             nan     0.0100    0.0004
##    160        0.7272             nan     0.0100    0.0005
##    180        0.6987             nan     0.0100    0.0003
##    200        0.6747             nan     0.0100    0.0001
##    220        0.6532             nan     0.0100   -0.0000
##    240        0.6342             nan     0.0100   -0.0001
##    260        0.6166             nan     0.0100    0.0000
##    280        0.6002             nan     0.0100    0.0000
##    300        0.5845             nan     0.0100    0.0001
##    320        0.5715             nan     0.0100    0.0002
##    340        0.5584             nan     0.0100   -0.0002
##    360        0.5458             nan     0.0100   -0.0001
##    380        0.5340             nan     0.0100    0.0000
##    400        0.5230             nan     0.0100    0.0000
##    420        0.5122             nan     0.0100    0.0001
##    440        0.5002             nan     0.0100    0.0001
##    460        0.4900             nan     0.0100    0.0001
##    480        0.4805             nan     0.0100   -0.0000
##    500        0.4706             nan     0.0100   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2345             nan     0.1000    0.0401
##      2        1.1729             nan     0.1000    0.0282
##      3        1.1151             nan     0.1000    0.0269
##      4        1.0683             nan     0.1000    0.0214
##      5        1.0295             nan     0.1000    0.0194
##      6        0.9887             nan     0.1000    0.0147
##      7        0.9573             nan     0.1000    0.0123
##      8        0.9280             nan     0.1000    0.0115
##      9        0.9019             nan     0.1000    0.0074
##     10        0.8770             nan     0.1000    0.0098
##     20        0.7268             nan     0.1000    0.0026
##     40        0.5977             nan     0.1000   -0.0007
##     60        0.5136             nan     0.1000   -0.0005
##     80        0.4465             nan     0.1000   -0.0005
##    100        0.3950             nan     0.1000    0.0009
##    120        0.3512             nan     0.1000   -0.0012
##    140        0.3122             nan     0.1000   -0.0004
##    160        0.2791             nan     0.1000    0.0001
##    180        0.2496             nan     0.1000   -0.0004
##    200        0.2267             nan     0.1000   -0.0009
##    220        0.2059             nan     0.1000   -0.0007
##    240        0.1874             nan     0.1000   -0.0008
##    260        0.1736             nan     0.1000   -0.0002
##    280        0.1619             nan     0.1000   -0.0005
##    300        0.1493             nan     0.1000   -0.0002
##    320        0.1372             nan     0.1000   -0.0003
##    340        0.1251             nan     0.1000    0.0001
##    360        0.1153             nan     0.1000   -0.0004
##    380        0.1057             nan     0.1000   -0.0001
##    400        0.0967             nan     0.1000   -0.0001
##    420        0.0893             nan     0.1000   -0.0003
##    440        0.0833             nan     0.1000   -0.0002
##    460        0.0757             nan     0.1000    0.0001
##    480        0.0702             nan     0.1000   -0.0001
##    500        0.0656             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2369             nan     0.1000    0.0396
##      2        1.1715             nan     0.1000    0.0296
##      3        1.1201             nan     0.1000    0.0225
##      4        1.0728             nan     0.1000    0.0202
##      5        1.0336             nan     0.1000    0.0167
##      6        0.9942             nan     0.1000    0.0170
##      7        0.9602             nan     0.1000    0.0107
##      8        0.9294             nan     0.1000    0.0107
##      9        0.8992             nan     0.1000    0.0116
##     10        0.8719             nan     0.1000    0.0101
##     20        0.7256             nan     0.1000    0.0024
##     40        0.5916             nan     0.1000    0.0000
##     60        0.5171             nan     0.1000   -0.0012
##     80        0.4477             nan     0.1000   -0.0003
##    100        0.3955             nan     0.1000   -0.0013
##    120        0.3519             nan     0.1000   -0.0008
##    140        0.3196             nan     0.1000   -0.0000
##    160        0.2869             nan     0.1000   -0.0001
##    180        0.2604             nan     0.1000   -0.0004
##    200        0.2367             nan     0.1000   -0.0010
##    220        0.2190             nan     0.1000   -0.0006
##    240        0.1977             nan     0.1000   -0.0013
##    260        0.1806             nan     0.1000   -0.0003
##    280        0.1631             nan     0.1000   -0.0003
##    300        0.1504             nan     0.1000   -0.0004
##    320        0.1372             nan     0.1000   -0.0000
##    340        0.1255             nan     0.1000   -0.0003
##    360        0.1170             nan     0.1000   -0.0008
##    380        0.1085             nan     0.1000   -0.0004
##    400        0.0994             nan     0.1000   -0.0001
##    420        0.0910             nan     0.1000   -0.0004
##    440        0.0845             nan     0.1000   -0.0003
##    460        0.0785             nan     0.1000   -0.0000
##    480        0.0724             nan     0.1000   -0.0001
##    500        0.0666             nan     0.1000   -0.0003
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2431             nan     0.1000    0.0378
##      2        1.1782             nan     0.1000    0.0298
##      3        1.1216             nan     0.1000    0.0261
##      4        1.0745             nan     0.1000    0.0189
##      5        1.0298             nan     0.1000    0.0191
##      6        0.9929             nan     0.1000    0.0146
##      7        0.9631             nan     0.1000    0.0121
##      8        0.9307             nan     0.1000    0.0123
##      9        0.9023             nan     0.1000    0.0122
##     10        0.8818             nan     0.1000    0.0071
##     20        0.7278             nan     0.1000    0.0026
##     40        0.6070             nan     0.1000   -0.0006
##     60        0.5341             nan     0.1000    0.0005
##     80        0.4740             nan     0.1000   -0.0007
##    100        0.4226             nan     0.1000   -0.0003
##    120        0.3808             nan     0.1000   -0.0011
##    140        0.3447             nan     0.1000   -0.0006
##    160        0.3136             nan     0.1000   -0.0015
##    180        0.2837             nan     0.1000   -0.0013
##    200        0.2569             nan     0.1000   -0.0008
##    220        0.2336             nan     0.1000   -0.0002
##    240        0.2160             nan     0.1000   -0.0006
##    260        0.1945             nan     0.1000   -0.0001
##    280        0.1773             nan     0.1000   -0.0007
##    300        0.1629             nan     0.1000   -0.0007
##    320        0.1499             nan     0.1000   -0.0004
##    340        0.1378             nan     0.1000   -0.0002
##    360        0.1263             nan     0.1000   -0.0004
##    380        0.1164             nan     0.1000   -0.0002
##    400        0.1077             nan     0.1000   -0.0001
##    420        0.0987             nan     0.1000   -0.0002
##    440        0.0919             nan     0.1000   -0.0001
##    460        0.0857             nan     0.1000   -0.0004
##    480        0.0791             nan     0.1000   -0.0002
##    500        0.0730             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2373             nan     0.1000    0.0384
##      2        1.1678             nan     0.1000    0.0297
##      3        1.1107             nan     0.1000    0.0258
##      4        1.0557             nan     0.1000    0.0220
##      5        1.0125             nan     0.1000    0.0161
##      6        0.9763             nan     0.1000    0.0147
##      7        0.9401             nan     0.1000    0.0152
##      8        0.9068             nan     0.1000    0.0130
##      9        0.8778             nan     0.1000    0.0101
##     10        0.8511             nan     0.1000    0.0116
##     20        0.6976             nan     0.1000    0.0019
##     40        0.5496             nan     0.1000    0.0004
##     60        0.4642             nan     0.1000   -0.0009
##     80        0.3916             nan     0.1000   -0.0016
##    100        0.3333             nan     0.1000   -0.0016
##    120        0.2878             nan     0.1000    0.0000
##    140        0.2527             nan     0.1000   -0.0013
##    160        0.2210             nan     0.1000    0.0000
##    180        0.1970             nan     0.1000   -0.0001
##    200        0.1731             nan     0.1000   -0.0002
##    220        0.1539             nan     0.1000   -0.0004
##    240        0.1379             nan     0.1000   -0.0003
##    260        0.1238             nan     0.1000   -0.0006
##    280        0.1101             nan     0.1000   -0.0003
##    300        0.0986             nan     0.1000    0.0000
##    320        0.0884             nan     0.1000   -0.0003
##    340        0.0794             nan     0.1000   -0.0002
##    360        0.0714             nan     0.1000   -0.0001
##    380        0.0647             nan     0.1000   -0.0002
##    400        0.0589             nan     0.1000   -0.0002
##    420        0.0535             nan     0.1000   -0.0002
##    440        0.0485             nan     0.1000   -0.0001
##    460        0.0437             nan     0.1000   -0.0001
##    480        0.0399             nan     0.1000   -0.0000
##    500        0.0357             nan     0.1000   -0.0000
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2354             nan     0.1000    0.0382
##      2        1.1642             nan     0.1000    0.0326
##      3        1.1065             nan     0.1000    0.0210
##      4        1.0566             nan     0.1000    0.0186
##      5        1.0150             nan     0.1000    0.0158
##      6        0.9727             nan     0.1000    0.0146
##      7        0.9390             nan     0.1000    0.0155
##      8        0.9113             nan     0.1000    0.0113
##      9        0.8829             nan     0.1000    0.0110
##     10        0.8586             nan     0.1000    0.0095
##     20        0.7028             nan     0.1000    0.0027
##     40        0.5604             nan     0.1000    0.0003
##     60        0.4822             nan     0.1000   -0.0004
##     80        0.4148             nan     0.1000   -0.0008
##    100        0.3607             nan     0.1000   -0.0006
##    120        0.3038             nan     0.1000   -0.0001
##    140        0.2657             nan     0.1000   -0.0005
##    160        0.2327             nan     0.1000   -0.0010
##    180        0.2045             nan     0.1000   -0.0008
##    200        0.1807             nan     0.1000   -0.0002
##    220        0.1590             nan     0.1000   -0.0009
##    240        0.1426             nan     0.1000   -0.0005
##    260        0.1277             nan     0.1000   -0.0006
##    280        0.1143             nan     0.1000   -0.0006
##    300        0.1015             nan     0.1000   -0.0003
##    320        0.0900             nan     0.1000   -0.0003
##    340        0.0813             nan     0.1000   -0.0001
##    360        0.0718             nan     0.1000   -0.0002
##    380        0.0651             nan     0.1000   -0.0002
##    400        0.0586             nan     0.1000   -0.0001
##    420        0.0535             nan     0.1000   -0.0002
##    440        0.0482             nan     0.1000   -0.0001
##    460        0.0436             nan     0.1000   -0.0001
##    480        0.0397             nan     0.1000   -0.0002
##    500        0.0362             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2396             nan     0.1000    0.0358
##      2        1.1789             nan     0.1000    0.0281
##      3        1.1174             nan     0.1000    0.0267
##      4        1.0672             nan     0.1000    0.0199
##      5        1.0220             nan     0.1000    0.0203
##      6        0.9810             nan     0.1000    0.0191
##      7        0.9399             nan     0.1000    0.0125
##      8        0.9087             nan     0.1000    0.0112
##      9        0.8845             nan     0.1000    0.0079
##     10        0.8552             nan     0.1000    0.0128
##     20        0.7056             nan     0.1000    0.0011
##     40        0.5704             nan     0.1000   -0.0011
##     60        0.4808             nan     0.1000   -0.0011
##     80        0.4099             nan     0.1000   -0.0005
##    100        0.3566             nan     0.1000   -0.0015
##    120        0.3099             nan     0.1000    0.0001
##    140        0.2724             nan     0.1000   -0.0014
##    160        0.2423             nan     0.1000   -0.0015
##    180        0.2183             nan     0.1000   -0.0011
##    200        0.1937             nan     0.1000    0.0001
##    220        0.1743             nan     0.1000   -0.0008
##    240        0.1567             nan     0.1000   -0.0004
##    260        0.1391             nan     0.1000   -0.0006
##    280        0.1241             nan     0.1000   -0.0005
##    300        0.1116             nan     0.1000   -0.0004
##    320        0.1019             nan     0.1000   -0.0003
##    340        0.0927             nan     0.1000   -0.0003
##    360        0.0836             nan     0.1000   -0.0004
##    380        0.0767             nan     0.1000   -0.0005
##    400        0.0694             nan     0.1000   -0.0001
##    420        0.0621             nan     0.1000   -0.0002
##    440        0.0567             nan     0.1000   -0.0001
##    460        0.0511             nan     0.1000   -0.0002
##    480        0.0463             nan     0.1000   -0.0001
##    500        0.0422             nan     0.1000   -0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2341             nan     0.1000    0.0376
##      2        1.1594             nan     0.1000    0.0335
##      3        1.0906             nan     0.1000    0.0306
##      4        1.0344             nan     0.1000    0.0223
##      5        0.9807             nan     0.1000    0.0213
##      6        0.9393             nan     0.1000    0.0149
##      7        0.9021             nan     0.1000    0.0150
##      8        0.8742             nan     0.1000    0.0092
##      9        0.8472             nan     0.1000    0.0085
##     10        0.8208             nan     0.1000    0.0090
##     20        0.6673             nan     0.1000    0.0015
##     40        0.5124             nan     0.1000    0.0003
##     60        0.4152             nan     0.1000   -0.0015
##     80        0.3504             nan     0.1000   -0.0008
##    100        0.2922             nan     0.1000    0.0009
##    120        0.2450             nan     0.1000    0.0000
##    140        0.2089             nan     0.1000   -0.0005
##    160        0.1760             nan     0.1000   -0.0001
##    180        0.1519             nan     0.1000   -0.0006
##    200        0.1317             nan     0.1000   -0.0000
##    220        0.1145             nan     0.1000   -0.0002
##    240        0.0995             nan     0.1000   -0.0002
##    260        0.0869             nan     0.1000   -0.0003
##    280        0.0760             nan     0.1000   -0.0002
##    300        0.0673             nan     0.1000    0.0000
##    320        0.0598             nan     0.1000   -0.0002
##    340        0.0525             nan     0.1000   -0.0001
##    360        0.0456             nan     0.1000   -0.0001
##    380        0.0399             nan     0.1000   -0.0001
##    400        0.0351             nan     0.1000   -0.0000
##    420        0.0311             nan     0.1000   -0.0001
##    440        0.0280             nan     0.1000   -0.0001
##    460        0.0251             nan     0.1000   -0.0001
##    480        0.0224             nan     0.1000   -0.0001
##    500        0.0198             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2265             nan     0.1000    0.0401
##      2        1.1588             nan     0.1000    0.0316
##      3        1.0868             nan     0.1000    0.0272
##      4        1.0326             nan     0.1000    0.0219
##      5        0.9892             nan     0.1000    0.0173
##      6        0.9531             nan     0.1000    0.0183
##      7        0.9182             nan     0.1000    0.0127
##      8        0.8882             nan     0.1000    0.0104
##      9        0.8553             nan     0.1000    0.0131
##     10        0.8290             nan     0.1000    0.0100
##     20        0.6713             nan     0.1000    0.0020
##     40        0.5188             nan     0.1000    0.0012
##     60        0.4193             nan     0.1000    0.0000
##     80        0.3447             nan     0.1000   -0.0005
##    100        0.2882             nan     0.1000   -0.0015
##    120        0.2441             nan     0.1000   -0.0007
##    140        0.2083             nan     0.1000   -0.0003
##    160        0.1793             nan     0.1000   -0.0008
##    180        0.1549             nan     0.1000   -0.0006
##    200        0.1321             nan     0.1000   -0.0004
##    220        0.1174             nan     0.1000   -0.0004
##    240        0.1016             nan     0.1000   -0.0001
##    260        0.0893             nan     0.1000   -0.0004
##    280        0.0793             nan     0.1000   -0.0002
##    300        0.0700             nan     0.1000   -0.0005
##    320        0.0615             nan     0.1000   -0.0003
##    340        0.0544             nan     0.1000   -0.0001
##    360        0.0486             nan     0.1000   -0.0001
##    380        0.0429             nan     0.1000    0.0001
##    400        0.0381             nan     0.1000   -0.0001
##    420        0.0336             nan     0.1000   -0.0001
##    440        0.0300             nan     0.1000   -0.0002
##    460        0.0268             nan     0.1000   -0.0001
##    480        0.0238             nan     0.1000   -0.0000
##    500        0.0213             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2317             nan     0.1000    0.0400
##      2        1.1610             nan     0.1000    0.0317
##      3        1.0991             nan     0.1000    0.0291
##      4        1.0479             nan     0.1000    0.0223
##      5        1.0003             nan     0.1000    0.0179
##      6        0.9550             nan     0.1000    0.0194
##      7        0.9231             nan     0.1000    0.0121
##      8        0.8910             nan     0.1000    0.0126
##      9        0.8598             nan     0.1000    0.0104
##     10        0.8330             nan     0.1000    0.0076
##     20        0.6760             nan     0.1000    0.0016
##     40        0.5283             nan     0.1000   -0.0007
##     60        0.4366             nan     0.1000   -0.0006
##     80        0.3673             nan     0.1000   -0.0011
##    100        0.3103             nan     0.1000   -0.0008
##    120        0.2665             nan     0.1000   -0.0008
##    140        0.2295             nan     0.1000   -0.0008
##    160        0.1961             nan     0.1000   -0.0001
##    180        0.1708             nan     0.1000   -0.0002
##    200        0.1488             nan     0.1000   -0.0003
##    220        0.1296             nan     0.1000   -0.0004
##    240        0.1131             nan     0.1000   -0.0003
##    260        0.1007             nan     0.1000   -0.0008
##    280        0.0867             nan     0.1000   -0.0003
##    300        0.0754             nan     0.1000   -0.0002
##    320        0.0672             nan     0.1000   -0.0001
##    340        0.0590             nan     0.1000   -0.0003
##    360        0.0518             nan     0.1000   -0.0001
##    380        0.0458             nan     0.1000   -0.0001
##    400        0.0409             nan     0.1000   -0.0002
##    420        0.0364             nan     0.1000   -0.0001
##    440        0.0330             nan     0.1000   -0.0001
##    460        0.0293             nan     0.1000   -0.0001
##    480        0.0262             nan     0.1000   -0.0001
##    500        0.0233             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0004
##      2        1.3194             nan     0.0010    0.0004
##      3        1.3185             nan     0.0010    0.0004
##      4        1.3176             nan     0.0010    0.0004
##      5        1.3168             nan     0.0010    0.0004
##      6        1.3158             nan     0.0010    0.0004
##      7        1.3149             nan     0.0010    0.0004
##      8        1.3140             nan     0.0010    0.0004
##      9        1.3131             nan     0.0010    0.0005
##     10        1.3121             nan     0.0010    0.0004
##     20        1.3035             nan     0.0010    0.0004
##     40        1.2865             nan     0.0010    0.0004
##     60        1.2698             nan     0.0010    0.0004
##     80        1.2535             nan     0.0010    0.0004
##    100        1.2385             nan     0.0010    0.0004
##    120        1.2235             nan     0.0010    0.0003
##    140        1.2090             nan     0.0010    0.0003
##    160        1.1951             nan     0.0010    0.0003
##    180        1.1818             nan     0.0010    0.0003
##    200        1.1687             nan     0.0010    0.0003
##    220        1.1561             nan     0.0010    0.0003
##    240        1.1442             nan     0.0010    0.0003
##    260        1.1325             nan     0.0010    0.0002
##    280        1.1208             nan     0.0010    0.0003
##    300        1.1098             nan     0.0010    0.0003
##    320        1.0990             nan     0.0010    0.0002
##    340        1.0885             nan     0.0010    0.0002
##    360        1.0782             nan     0.0010    0.0002
##    380        1.0681             nan     0.0010    0.0002
##    400        1.0586             nan     0.0010    0.0002
##    420        1.0495             nan     0.0010    0.0002
##    440        1.0404             nan     0.0010    0.0002
##    460        1.0316             nan     0.0010    0.0002
##    480        1.0232             nan     0.0010    0.0002
##    500        1.0148             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3204             nan     0.0010    0.0004
##      2        1.3195             nan     0.0010    0.0004
##      3        1.3186             nan     0.0010    0.0004
##      4        1.3178             nan     0.0010    0.0004
##      5        1.3168             nan     0.0010    0.0004
##      6        1.3159             nan     0.0010    0.0004
##      7        1.3151             nan     0.0010    0.0004
##      8        1.3143             nan     0.0010    0.0004
##      9        1.3134             nan     0.0010    0.0004
##     10        1.3125             nan     0.0010    0.0004
##     20        1.3035             nan     0.0010    0.0004
##     40        1.2865             nan     0.0010    0.0004
##     60        1.2700             nan     0.0010    0.0004
##     80        1.2540             nan     0.0010    0.0003
##    100        1.2393             nan     0.0010    0.0004
##    120        1.2249             nan     0.0010    0.0004
##    140        1.2106             nan     0.0010    0.0003
##    160        1.1966             nan     0.0010    0.0003
##    180        1.1831             nan     0.0010    0.0003
##    200        1.1702             nan     0.0010    0.0003
##    220        1.1575             nan     0.0010    0.0003
##    240        1.1453             nan     0.0010    0.0003
##    260        1.1338             nan     0.0010    0.0003
##    280        1.1221             nan     0.0010    0.0002
##    300        1.1109             nan     0.0010    0.0003
##    320        1.1001             nan     0.0010    0.0002
##    340        1.0897             nan     0.0010    0.0002
##    360        1.0797             nan     0.0010    0.0002
##    380        1.0696             nan     0.0010    0.0002
##    400        1.0601             nan     0.0010    0.0002
##    420        1.0506             nan     0.0010    0.0002
##    440        1.0416             nan     0.0010    0.0002
##    460        1.0328             nan     0.0010    0.0002
##    480        1.0244             nan     0.0010    0.0002
##    500        1.0160             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3204             nan     0.0010    0.0003
##      2        1.3194             nan     0.0010    0.0005
##      3        1.3185             nan     0.0010    0.0004
##      4        1.3176             nan     0.0010    0.0004
##      5        1.3167             nan     0.0010    0.0004
##      6        1.3158             nan     0.0010    0.0004
##      7        1.3149             nan     0.0010    0.0004
##      8        1.3141             nan     0.0010    0.0004
##      9        1.3132             nan     0.0010    0.0004
##     10        1.3123             nan     0.0010    0.0004
##     20        1.3035             nan     0.0010    0.0004
##     40        1.2864             nan     0.0010    0.0004
##     60        1.2701             nan     0.0010    0.0004
##     80        1.2545             nan     0.0010    0.0003
##    100        1.2393             nan     0.0010    0.0004
##    120        1.2247             nan     0.0010    0.0003
##    140        1.2103             nan     0.0010    0.0003
##    160        1.1971             nan     0.0010    0.0003
##    180        1.1836             nan     0.0010    0.0003
##    200        1.1711             nan     0.0010    0.0003
##    220        1.1586             nan     0.0010    0.0002
##    240        1.1467             nan     0.0010    0.0003
##    260        1.1350             nan     0.0010    0.0002
##    280        1.1238             nan     0.0010    0.0002
##    300        1.1128             nan     0.0010    0.0002
##    320        1.1023             nan     0.0010    0.0003
##    340        1.0918             nan     0.0010    0.0003
##    360        1.0817             nan     0.0010    0.0002
##    380        1.0719             nan     0.0010    0.0002
##    400        1.0625             nan     0.0010    0.0002
##    420        1.0532             nan     0.0010    0.0002
##    440        1.0442             nan     0.0010    0.0002
##    460        1.0355             nan     0.0010    0.0002
##    480        1.0271             nan     0.0010    0.0002
##    500        1.0188             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0005
##      2        1.3192             nan     0.0010    0.0005
##      3        1.3182             nan     0.0010    0.0005
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3163             nan     0.0010    0.0005
##      6        1.3153             nan     0.0010    0.0004
##      7        1.3144             nan     0.0010    0.0004
##      8        1.3134             nan     0.0010    0.0004
##      9        1.3125             nan     0.0010    0.0004
##     10        1.3116             nan     0.0010    0.0005
##     20        1.3023             nan     0.0010    0.0004
##     40        1.2842             nan     0.0010    0.0004
##     60        1.2666             nan     0.0010    0.0004
##     80        1.2498             nan     0.0010    0.0003
##    100        1.2337             nan     0.0010    0.0004
##    120        1.2183             nan     0.0010    0.0003
##    140        1.2030             nan     0.0010    0.0003
##    160        1.1884             nan     0.0010    0.0004
##    180        1.1741             nan     0.0010    0.0003
##    200        1.1606             nan     0.0010    0.0003
##    220        1.1477             nan     0.0010    0.0002
##    240        1.1351             nan     0.0010    0.0003
##    260        1.1227             nan     0.0010    0.0003
##    280        1.1106             nan     0.0010    0.0002
##    300        1.0989             nan     0.0010    0.0003
##    320        1.0875             nan     0.0010    0.0002
##    340        1.0765             nan     0.0010    0.0003
##    360        1.0657             nan     0.0010    0.0002
##    380        1.0556             nan     0.0010    0.0002
##    400        1.0455             nan     0.0010    0.0002
##    420        1.0356             nan     0.0010    0.0002
##    440        1.0260             nan     0.0010    0.0002
##    460        1.0166             nan     0.0010    0.0002
##    480        1.0077             nan     0.0010    0.0002
##    500        0.9989             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0005
##      2        1.3192             nan     0.0010    0.0004
##      3        1.3182             nan     0.0010    0.0005
##      4        1.3173             nan     0.0010    0.0004
##      5        1.3164             nan     0.0010    0.0004
##      6        1.3155             nan     0.0010    0.0004
##      7        1.3145             nan     0.0010    0.0005
##      8        1.3135             nan     0.0010    0.0004
##      9        1.3126             nan     0.0010    0.0005
##     10        1.3117             nan     0.0010    0.0004
##     20        1.3026             nan     0.0010    0.0004
##     40        1.2845             nan     0.0010    0.0004
##     60        1.2674             nan     0.0010    0.0004
##     80        1.2509             nan     0.0010    0.0004
##    100        1.2350             nan     0.0010    0.0003
##    120        1.2195             nan     0.0010    0.0003
##    140        1.2044             nan     0.0010    0.0004
##    160        1.1898             nan     0.0010    0.0003
##    180        1.1757             nan     0.0010    0.0003
##    200        1.1620             nan     0.0010    0.0003
##    220        1.1486             nan     0.0010    0.0003
##    240        1.1360             nan     0.0010    0.0003
##    260        1.1238             nan     0.0010    0.0003
##    280        1.1117             nan     0.0010    0.0003
##    300        1.1001             nan     0.0010    0.0003
##    320        1.0888             nan     0.0010    0.0002
##    340        1.0778             nan     0.0010    0.0002
##    360        1.0673             nan     0.0010    0.0002
##    380        1.0570             nan     0.0010    0.0003
##    400        1.0466             nan     0.0010    0.0002
##    420        1.0366             nan     0.0010    0.0002
##    440        1.0275             nan     0.0010    0.0002
##    460        1.0181             nan     0.0010    0.0002
##    480        1.0092             nan     0.0010    0.0002
##    500        1.0003             nan     0.0010    0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3185             nan     0.0010    0.0004
##      4        1.3175             nan     0.0010    0.0004
##      5        1.3167             nan     0.0010    0.0004
##      6        1.3159             nan     0.0010    0.0004
##      7        1.3149             nan     0.0010    0.0004
##      8        1.3139             nan     0.0010    0.0004
##      9        1.3130             nan     0.0010    0.0004
##     10        1.3121             nan     0.0010    0.0004
##     20        1.3029             nan     0.0010    0.0004
##     40        1.2851             nan     0.0010    0.0004
##     60        1.2679             nan     0.0010    0.0004
##     80        1.2515             nan     0.0010    0.0004
##    100        1.2356             nan     0.0010    0.0004
##    120        1.2204             nan     0.0010    0.0003
##    140        1.2055             nan     0.0010    0.0004
##    160        1.1907             nan     0.0010    0.0003
##    180        1.1769             nan     0.0010    0.0003
##    200        1.1634             nan     0.0010    0.0003
##    220        1.1507             nan     0.0010    0.0003
##    240        1.1381             nan     0.0010    0.0002
##    260        1.1256             nan     0.0010    0.0003
##    280        1.1139             nan     0.0010    0.0002
##    300        1.1024             nan     0.0010    0.0003
##    320        1.0913             nan     0.0010    0.0002
##    340        1.0806             nan     0.0010    0.0002
##    360        1.0702             nan     0.0010    0.0003
##    380        1.0598             nan     0.0010    0.0002
##    400        1.0498             nan     0.0010    0.0002
##    420        1.0400             nan     0.0010    0.0002
##    440        1.0306             nan     0.0010    0.0002
##    460        1.0216             nan     0.0010    0.0002
##    480        1.0125             nan     0.0010    0.0002
##    500        1.0038             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0005
##      2        1.3191             nan     0.0010    0.0005
##      3        1.3181             nan     0.0010    0.0005
##      4        1.3171             nan     0.0010    0.0004
##      5        1.3161             nan     0.0010    0.0005
##      6        1.3151             nan     0.0010    0.0004
##      7        1.3142             nan     0.0010    0.0005
##      8        1.3132             nan     0.0010    0.0004
##      9        1.3123             nan     0.0010    0.0004
##     10        1.3112             nan     0.0010    0.0004
##     20        1.3014             nan     0.0010    0.0004
##     40        1.2827             nan     0.0010    0.0004
##     60        1.2645             nan     0.0010    0.0004
##     80        1.2468             nan     0.0010    0.0003
##    100        1.2303             nan     0.0010    0.0004
##    120        1.2139             nan     0.0010    0.0003
##    140        1.1983             nan     0.0010    0.0004
##    160        1.1831             nan     0.0010    0.0003
##    180        1.1683             nan     0.0010    0.0003
##    200        1.1543             nan     0.0010    0.0003
##    220        1.1405             nan     0.0010    0.0003
##    240        1.1271             nan     0.0010    0.0003
##    260        1.1144             nan     0.0010    0.0003
##    280        1.1020             nan     0.0010    0.0003
##    300        1.0898             nan     0.0010    0.0003
##    320        1.0780             nan     0.0010    0.0003
##    340        1.0666             nan     0.0010    0.0002
##    360        1.0553             nan     0.0010    0.0002
##    380        1.0445             nan     0.0010    0.0002
##    400        1.0340             nan     0.0010    0.0002
##    420        1.0239             nan     0.0010    0.0002
##    440        1.0141             nan     0.0010    0.0002
##    460        1.0044             nan     0.0010    0.0002
##    480        0.9951             nan     0.0010    0.0002
##    500        0.9860             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3203             nan     0.0010    0.0004
##      2        1.3193             nan     0.0010    0.0004
##      3        1.3184             nan     0.0010    0.0004
##      4        1.3174             nan     0.0010    0.0004
##      5        1.3164             nan     0.0010    0.0004
##      6        1.3155             nan     0.0010    0.0004
##      7        1.3144             nan     0.0010    0.0005
##      8        1.3135             nan     0.0010    0.0004
##      9        1.3126             nan     0.0010    0.0005
##     10        1.3116             nan     0.0010    0.0004
##     20        1.3018             nan     0.0010    0.0004
##     40        1.2834             nan     0.0010    0.0004
##     60        1.2653             nan     0.0010    0.0004
##     80        1.2483             nan     0.0010    0.0004
##    100        1.2315             nan     0.0010    0.0003
##    120        1.2153             nan     0.0010    0.0004
##    140        1.1999             nan     0.0010    0.0004
##    160        1.1848             nan     0.0010    0.0003
##    180        1.1706             nan     0.0010    0.0003
##    200        1.1565             nan     0.0010    0.0003
##    220        1.1429             nan     0.0010    0.0003
##    240        1.1298             nan     0.0010    0.0003
##    260        1.1172             nan     0.0010    0.0003
##    280        1.1049             nan     0.0010    0.0003
##    300        1.0926             nan     0.0010    0.0002
##    320        1.0807             nan     0.0010    0.0003
##    340        1.0695             nan     0.0010    0.0002
##    360        1.0582             nan     0.0010    0.0002
##    380        1.0477             nan     0.0010    0.0002
##    400        1.0369             nan     0.0010    0.0003
##    420        1.0267             nan     0.0010    0.0002
##    440        1.0167             nan     0.0010    0.0002
##    460        1.0072             nan     0.0010    0.0002
##    480        0.9978             nan     0.0010    0.0002
##    500        0.9889             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3202             nan     0.0010    0.0005
##      2        1.3192             nan     0.0010    0.0004
##      3        1.3182             nan     0.0010    0.0005
##      4        1.3173             nan     0.0010    0.0005
##      5        1.3163             nan     0.0010    0.0005
##      6        1.3153             nan     0.0010    0.0004
##      7        1.3144             nan     0.0010    0.0004
##      8        1.3134             nan     0.0010    0.0005
##      9        1.3124             nan     0.0010    0.0005
##     10        1.3115             nan     0.0010    0.0004
##     20        1.3020             nan     0.0010    0.0004
##     40        1.2837             nan     0.0010    0.0004
##     60        1.2662             nan     0.0010    0.0004
##     80        1.2491             nan     0.0010    0.0004
##    100        1.2329             nan     0.0010    0.0004
##    120        1.2170             nan     0.0010    0.0004
##    140        1.2015             nan     0.0010    0.0003
##    160        1.1866             nan     0.0010    0.0003
##    180        1.1721             nan     0.0010    0.0003
##    200        1.1581             nan     0.0010    0.0003
##    220        1.1448             nan     0.0010    0.0003
##    240        1.1318             nan     0.0010    0.0003
##    260        1.1192             nan     0.0010    0.0002
##    280        1.1070             nan     0.0010    0.0003
##    300        1.0954             nan     0.0010    0.0002
##    320        1.0840             nan     0.0010    0.0002
##    340        1.0727             nan     0.0010    0.0002
##    360        1.0621             nan     0.0010    0.0002
##    380        1.0515             nan     0.0010    0.0002
##    400        1.0412             nan     0.0010    0.0002
##    420        1.0312             nan     0.0010    0.0002
##    440        1.0216             nan     0.0010    0.0002
##    460        1.0122             nan     0.0010    0.0002
##    480        1.0031             nan     0.0010    0.0002
##    500        0.9940             nan     0.0010    0.0002
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.3120             nan     0.0100    0.0044
##      2        1.3040             nan     0.0100    0.0033
##      3        1.2946             nan     0.0100    0.0045
##      4        1.2860             nan     0.0100    0.0036
##      5        1.2770             nan     0.0100    0.0042
##      6        1.2686             nan     0.0100    0.0039
##      7        1.2610             nan     0.0100    0.0035
##      8        1.2536             nan     0.0100    0.0038
##      9        1.2461             nan     0.0100    0.0034
##     10        1.2385             nan     0.0100    0.0037
##     20        1.1692             nan     0.0100    0.0031
##     40        1.0587             nan     0.0100    0.0021
##     60        0.9767             nan     0.0100    0.0016
##     80        0.9117             nan     0.0100    0.0011
##    100        0.8619             nan     0.0100    0.0006
##    120        0.8197             nan     0.0100    0.0006
##    140        0.7844             nan     0.0100    0.0004
##    160        0.7542             nan     0.0100    0.0004
##    180        0.7301             nan     0.0100    0.0004
##    200        0.7076             nan     0.0100    0.0005
##    220        0.6885             nan     0.0100    0.0004
##    240        0.6715             nan     0.0100    0.0001
##    260        0.6558             nan     0.0100    0.0000
##    280        0.6415             nan     0.0100    0.0002
##    300        0.6290             nan     0.0100   -0.0000
##    320        0.6164             nan     0.0100    0.0002
##    340        0.6046             nan     0.0100   -0.0000
##    360        0.5940             nan     0.0100    0.0001
##    380        0.5837             nan     0.0100   -0.0001
##    400        0.5738             nan     0.0100   -0.0001
##    420        0.5646             nan     0.0100    0.0000
##    440        0.5558             nan     0.0100   -0.0000
##    460        0.5472             nan     0.0100    0.0001
##    480        0.5386             nan     0.0100   -0.0001
##    500        0.5305             nan     0.0100   -0.0000
## [Verbose boosting logs for the remaining cross-validation resamples at shrinkage = 0.010 omitted; each trace follows the same pattern, with training deviance falling from about 1.31 to 0.43-0.54 over 500 iterations.]
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2353             nan     0.1000    0.0391
##      2        1.1606             nan     0.1000    0.0317
##      3        1.0978             nan     0.1000    0.0281
##      4        1.0456             nan     0.1000    0.0233
##      5        1.0038             nan     0.1000    0.0177
##      6        0.9669             nan     0.1000    0.0171
##      7        0.9352             nan     0.1000    0.0116
##      8        0.9046             nan     0.1000    0.0111
##      9        0.8770             nan     0.1000    0.0095
##     10        0.8480             nan     0.1000    0.0126
##     20        0.7049             nan     0.1000    0.0005
##     40        0.5837             nan     0.1000   -0.0007
##     60        0.5097             nan     0.1000   -0.0007
##     80        0.4429             nan     0.1000   -0.0013
##    100        0.3870             nan     0.1000    0.0001
##    120        0.3438             nan     0.1000   -0.0009
##    140        0.3103             nan     0.1000   -0.0014
##    160        0.2791             nan     0.1000    0.0000
##    180        0.2534             nan     0.1000   -0.0011
##    200        0.2256             nan     0.1000   -0.0009
##    220        0.2045             nan     0.1000   -0.0009
##    240        0.1859             nan     0.1000   -0.0001
##    260        0.1669             nan     0.1000   -0.0002
##    280        0.1517             nan     0.1000   -0.0004
##    300        0.1372             nan     0.1000   -0.0003
##    320        0.1264             nan     0.1000   -0.0002
##    340        0.1167             nan     0.1000   -0.0002
##    360        0.1061             nan     0.1000   -0.0002
##    380        0.0975             nan     0.1000   -0.0003
##    400        0.0899             nan     0.1000   -0.0001
##    420        0.0829             nan     0.1000   -0.0003
##    440        0.0769             nan     0.1000   -0.0001
##    460        0.0712             nan     0.1000   -0.0002
##    480        0.0653             nan     0.1000    0.0000
##    500        0.0603             nan     0.1000   -0.0001
## [Verbose boosting logs for the remaining cross-validation resamples at shrinkage = 0.100 omitted; each trace follows the same pattern, with training deviance falling from about 1.24 to 0.035-0.074 over 500 iterations and the Improve column turning negative within the first 100 iterations.]
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2293             nan     0.1000    0.0420
##      2        1.1573             nan     0.1000    0.0318
##      3        1.0855             nan     0.1000    0.0292
##      4        1.0321             nan     0.1000    0.0216
##      5        0.9887             nan     0.1000    0.0177
##      6        0.9443             nan     0.1000    0.0136
##      7        0.9071             nan     0.1000    0.0154
##      8        0.8746             nan     0.1000    0.0138
##      9        0.8457             nan     0.1000    0.0106
##     10        0.8195             nan     0.1000    0.0106
##     20        0.6463             nan     0.1000    0.0043
##     40        0.4960             nan     0.1000   -0.0024
##     60        0.3977             nan     0.1000    0.0001
##     80        0.3306             nan     0.1000    0.0012
##    100        0.2779             nan     0.1000   -0.0009
##    120        0.2350             nan     0.1000   -0.0004
##    140        0.2028             nan     0.1000   -0.0005
##    160        0.1751             nan     0.1000   -0.0005
##    180        0.1494             nan     0.1000   -0.0000
##    200        0.1296             nan     0.1000   -0.0003
##    220        0.1155             nan     0.1000   -0.0003
##    240        0.1019             nan     0.1000   -0.0002
##    260        0.0899             nan     0.1000   -0.0000
##    280        0.0789             nan     0.1000   -0.0005
##    300        0.0695             nan     0.1000   -0.0002
##    320        0.0606             nan     0.1000   -0.0003
##    340        0.0536             nan     0.1000   -0.0002
##    360        0.0465             nan     0.1000   -0.0001
##    380        0.0417             nan     0.1000   -0.0002
##    400        0.0366             nan     0.1000   -0.0000
##    420        0.0321             nan     0.1000   -0.0001
##    440        0.0282             nan     0.1000   -0.0001
##    460        0.0251             nan     0.1000   -0.0002
##    480        0.0225             nan     0.1000   -0.0000
##    500        0.0201             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2276             nan     0.1000    0.0409
##      2        1.1466             nan     0.1000    0.0324
##      3        1.0798             nan     0.1000    0.0314
##      4        1.0183             nan     0.1000    0.0278
##      5        0.9667             nan     0.1000    0.0212
##      6        0.9265             nan     0.1000    0.0152
##      7        0.8925             nan     0.1000    0.0155
##      8        0.8595             nan     0.1000    0.0140
##      9        0.8350             nan     0.1000    0.0076
##     10        0.8081             nan     0.1000    0.0093
##     20        0.6393             nan     0.1000    0.0024
##     40        0.4879             nan     0.1000    0.0008
##     60        0.3943             nan     0.1000    0.0005
##     80        0.3298             nan     0.1000    0.0004
##    100        0.2823             nan     0.1000   -0.0006
##    120        0.2415             nan     0.1000   -0.0008
##    140        0.2050             nan     0.1000   -0.0009
##    160        0.1751             nan     0.1000   -0.0007
##    180        0.1517             nan     0.1000   -0.0008
##    200        0.1305             nan     0.1000   -0.0001
##    220        0.1140             nan     0.1000    0.0001
##    240        0.1000             nan     0.1000   -0.0002
##    260        0.0879             nan     0.1000   -0.0003
##    280        0.0776             nan     0.1000   -0.0003
##    300        0.0673             nan     0.1000   -0.0002
##    320        0.0597             nan     0.1000   -0.0003
##    340        0.0531             nan     0.1000   -0.0003
##    360        0.0473             nan     0.1000   -0.0002
##    380        0.0416             nan     0.1000   -0.0001
##    400        0.0365             nan     0.1000   -0.0001
##    420        0.0320             nan     0.1000   -0.0001
##    440        0.0280             nan     0.1000   -0.0001
##    460        0.0248             nan     0.1000   -0.0001
##    480        0.0219             nan     0.1000   -0.0001
##    500        0.0194             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2366             nan     0.1000    0.0385
##      2        1.1584             nan     0.1000    0.0368
##      3        1.0991             nan     0.1000    0.0273
##      4        1.0466             nan     0.1000    0.0222
##      5        0.9926             nan     0.1000    0.0247
##      6        0.9464             nan     0.1000    0.0195
##      7        0.9062             nan     0.1000    0.0151
##      8        0.8731             nan     0.1000    0.0148
##      9        0.8407             nan     0.1000    0.0122
##     10        0.8173             nan     0.1000    0.0095
##     20        0.6617             nan     0.1000    0.0025
##     40        0.5077             nan     0.1000   -0.0001
##     60        0.4201             nan     0.1000    0.0006
##     80        0.3559             nan     0.1000   -0.0011
##    100        0.3013             nan     0.1000   -0.0011
##    120        0.2559             nan     0.1000   -0.0008
##    140        0.2232             nan     0.1000   -0.0000
##    160        0.1944             nan     0.1000   -0.0004
##    180        0.1702             nan     0.1000   -0.0008
##    200        0.1483             nan     0.1000   -0.0007
##    220        0.1276             nan     0.1000   -0.0007
##    240        0.1111             nan     0.1000   -0.0002
##    260        0.0993             nan     0.1000   -0.0004
##    280        0.0867             nan     0.1000   -0.0003
##    300        0.0761             nan     0.1000   -0.0002
##    320        0.0675             nan     0.1000   -0.0002
##    340        0.0598             nan     0.1000   -0.0002
##    360        0.0535             nan     0.1000   -0.0001
##    380        0.0476             nan     0.1000   -0.0003
##    400        0.0421             nan     0.1000   -0.0002
##    420        0.0371             nan     0.1000   -0.0002
##    440        0.0327             nan     0.1000   -0.0002
##    460        0.0292             nan     0.1000   -0.0001
##    480        0.0264             nan     0.1000   -0.0002
##    500        0.0234             nan     0.1000   -0.0001
## 
## Iter   TrainDeviance   ValidDeviance   StepSize   Improve
##      1        1.2237             nan     0.1000    0.0446
##      2        1.1491             nan     0.1000    0.0353
##      3        1.0814             nan     0.1000    0.0290
##      4        1.0278             nan     0.1000    0.0217
##      5        0.9859             nan     0.1000    0.0156
##      6        0.9456             nan     0.1000    0.0164
##      7        0.9088             nan     0.1000    0.0144
##      8        0.8745             nan     0.1000    0.0144
##      9        0.8434             nan     0.1000    0.0122
##     10        0.8206             nan     0.1000    0.0090
##     20        0.6571             nan     0.1000    0.0038
##     40        0.5150             nan     0.1000   -0.0013
##     60        0.4277             nan     0.1000    0.0004
##     80        0.3509             nan     0.1000   -0.0002
##    100        0.2923             nan     0.1000    0.0003
##    120        0.2494             nan     0.1000   -0.0002
##    140        0.2124             nan     0.1000    0.0002
##    160        0.1827             nan     0.1000    0.0003
##    180        0.1574             nan     0.1000   -0.0003
##    200        0.1391             nan     0.1000   -0.0002
##    220        0.1235             nan     0.1000   -0.0003
##    240        0.1105             nan     0.1000   -0.0001
##    260        0.0973             nan     0.1000   -0.0000
##    280        0.0851             nan     0.1000   -0.0001
##    300        0.0753             nan     0.1000   -0.0002
##    320        0.0673             nan     0.1000   -0.0001
##    340        0.0593             nan     0.1000   -0.0000
##    360        0.0528             nan     0.1000   -0.0001
##    380        0.0470             nan     0.1000   -0.0001
##    400        0.0418             nan     0.1000   -0.0001
##    420        0.0372             nan     0.1000   -0.0001
##    440        0.0334             nan     0.1000    0.0000
##    460        0.0297             nan     0.1000   -0.0001
##    480        0.0266             nan     0.1000   -0.0000
##    500        0.0237             nan     0.1000   -0.0000
##################################
# Reporting the cross-validation results
# for the train set
##################################
MBS_GBM_Tune
## Stochastic Gradient Boosting 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 729, 729, 730, 730, 730, 730, ... 
## Resampling results across tuning parameters:
## 
##   shrinkage  interaction.depth  n.minobsinnode  ROC        Sens       Spec     
##   0.001      4                   5              0.8910798  0.5294118  0.9513928
##   0.001      4                  10              0.8914021  0.5276471  0.9524424
##   0.001      4                  15              0.8897511  0.5211765  0.9545294
##   0.001      5                   5              0.8958001  0.5658824  0.9478871
##   0.001      5                  10              0.8958208  0.5658824  0.9478932
##   0.001      5                  15              0.8941702  0.5547059  0.9517346
##   0.001      6                   5              0.9005384  0.5864706  0.9485828
##   0.001      6                  10              0.8994397  0.5870588  0.9499832
##   0.001      6                  15              0.8974198  0.5688235  0.9496384
##   0.010      4                   5              0.9185276  0.7723529  0.9087231
##   0.010      4                  10              0.9165290  0.7705882  0.9090740
##   0.010      4                  15              0.9153323  0.7617647  0.9104683
##   0.010      5                   5              0.9237008  0.7764706  0.9094279
##   0.010      5                  10              0.9228961  0.7741176  0.9111701
##   0.010      5                  15              0.9216394  0.7682353  0.9115271
##   0.010      6                   5              0.9296301  0.7852941  0.9132662
##   0.010      6                  10              0.9281216  0.7794118  0.9146606
##   0.010      6                  15              0.9263270  0.7835294  0.9164119
##   0.100      4                   5              0.9573128  0.8964706  0.9359847
##   0.100      4                  10              0.9550397  0.8994118  0.9401892
##   0.100      4                  15              0.9547391  0.8947059  0.9450801
##   0.100      5                   5              0.9584462  0.8958824  0.9408879
##   0.100      5                  10              0.9581581  0.8976471  0.9443753
##   0.100      5                  15              0.9566970  0.8964706  0.9436949
##   0.100      6                   5              0.9599595  0.8970588  0.9405370
##   0.100      6                  10              0.9594183  0.8964706  0.9443905
##   0.100      6                  15              0.9576763  0.8917647  0.9471762
## 
## Tuning parameter 'n.trees' was held constant at a value of 500
## ROC was used to select the optimal model using the largest value.
## The final values used for the model were n.trees = 500, interaction.depth =
##  6, shrinkage = 0.1 and n.minobsinnode = 5.
MBS_GBM_Tune$finalModel
## A gradient boosted model with bernoulli loss function.
## 500 iterations were performed.
## There were 6 predictors of which 6 had non-zero influence.
MBS_GBM_Tune$results
##    shrinkage interaction.depth n.minobsinnode n.trees       ROC      Sens
## 1      0.001                 4              5     500 0.8910798 0.5294118
## 2      0.001                 4             10     500 0.8914021 0.5276471
## 3      0.001                 4             15     500 0.8897511 0.5211765
## 10     0.010                 4              5     500 0.9185276 0.7723529
## 11     0.010                 4             10     500 0.9165290 0.7705882
## 12     0.010                 4             15     500 0.9153323 0.7617647
## 19     0.100                 4              5     500 0.9573128 0.8964706
## 20     0.100                 4             10     500 0.9550397 0.8994118
## 21     0.100                 4             15     500 0.9547391 0.8947059
## 4      0.001                 5              5     500 0.8958001 0.5658824
## 5      0.001                 5             10     500 0.8958208 0.5658824
## 6      0.001                 5             15     500 0.8941702 0.5547059
## 13     0.010                 5              5     500 0.9237008 0.7764706
## 14     0.010                 5             10     500 0.9228961 0.7741176
## 15     0.010                 5             15     500 0.9216394 0.7682353
## 22     0.100                 5              5     500 0.9584462 0.8958824
## 23     0.100                 5             10     500 0.9581581 0.8976471
## 24     0.100                 5             15     500 0.9566970 0.8964706
## 7      0.001                 6              5     500 0.9005384 0.5864706
## 8      0.001                 6             10     500 0.8994397 0.5870588
## 9      0.001                 6             15     500 0.8974198 0.5688235
## 16     0.010                 6              5     500 0.9296301 0.7852941
## 17     0.010                 6             10     500 0.9281216 0.7794118
## 18     0.010                 6             15     500 0.9263270 0.7835294
## 25     0.100                 6              5     500 0.9599595 0.8970588
## 26     0.100                 6             10     500 0.9594183 0.8964706
## 27     0.100                 6             15     500 0.9576763 0.8917647
##         Spec      ROCSD     SensSD     SpecSD
## 1  0.9513928 0.02487947 0.05615902 0.01890302
## 2  0.9524424 0.02483276 0.05028752 0.01872530
## 3  0.9545294 0.02523211 0.05077968 0.01875308
## 10 0.9087231 0.02224639 0.05438526 0.02364804
## 11 0.9090740 0.02275322 0.05567557 0.02480784
## 12 0.9104683 0.02329341 0.06048511 0.02549016
## 19 0.9359847 0.01682469 0.05293437 0.02263121
## 20 0.9401892 0.01932518 0.05610765 0.02512296
## 21 0.9450801 0.01725417 0.05141454 0.02269976
## 4  0.9478871 0.02361362 0.03962410 0.02093599
## 5  0.9478932 0.02380831 0.03799889 0.01997374
## 6  0.9517346 0.02417257 0.03948742 0.02023755
## 13 0.9094279 0.02116242 0.05599833 0.02578182
## 14 0.9111701 0.02219566 0.05803392 0.02589530
## 15 0.9115271 0.02228931 0.05441176 0.02431051
## 22 0.9408879 0.01822748 0.05435211 0.02470689
## 23 0.9443753 0.01734441 0.04977603 0.02627205
## 24 0.9436949 0.01742780 0.05361096 0.02330818
## 7  0.9485828 0.02293519 0.03921262 0.01949393
## 8  0.9499832 0.02371313 0.03842337 0.02284164
## 9  0.9496384 0.02343156 0.04043449 0.02088433
## 16 0.9132662 0.02095326 0.05147059 0.02429162
## 17 0.9146606 0.02157192 0.05164536 0.02525835
## 18 0.9164119 0.02092553 0.05494571 0.02154276
## 25 0.9405370 0.01648611 0.05164536 0.02569397
## 26 0.9443905 0.01700477 0.05310432 0.02450729
## 27 0.9471762 0.01647188 0.05248991 0.02258879
(MBS_GBM_Train_AUROC <- MBS_GBM_Tune$results[MBS_GBM_Tune$results$n.trees==MBS_GBM_Tune$bestTune$n.trees &
                                             MBS_GBM_Tune$results$shrinkage==MBS_GBM_Tune$bestTune$shrinkage &
                                             MBS_GBM_Tune$results$n.minobsinnode==MBS_GBM_Tune$bestTune$n.minobsinnode &
                                             MBS_GBM_Tune$results$interaction.depth==MBS_GBM_Tune$bestTune$interaction.depth,
                                             c("ROC")])
## [1] 0.9599595
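The multi-condition subset above can be written more compactly by joining the resampling results to the selected tuning parameters; a minimal sketch, assuming the `MBS_GBM_Tune` object fitted above:

```r
##################################
# Compact alternative: merge() keeps only the row of $results whose
# tuning-parameter columns match $bestTune exactly
##################################
MBS_GBM_Best_Row <- merge(MBS_GBM_Tune$results, MBS_GBM_Tune$bestTune)
MBS_GBM_Best_Row[, "ROC"]
```

This avoids enumerating every tuning column by hand and stays correct if the grid gains or loses parameters.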
##################################
# Identifying and plotting the
# best model predictors
##################################
MBS_GBM_VarImp <- varImp(MBS_GBM_Tune, scale = TRUE)
plot(MBS_GBM_VarImp,
     top=6,
     scales=list(y=list(cex = .95)),
     main="Ranked Variable Importance : Stochastic Gradient Boosting",
     xlab="Scaled Variable Importance Metrics",
     ylab="Predictors",
     cex=2,
     origin=0,
     alpha=0.45)

##################################
# Independently evaluating the model
# on the test set
##################################
MBS_GBM_Test <- data.frame(MBS_GBM_Test_Observed = MA_Test$diagnosis,
                          MBS_GBM_Test_Predicted = predict(MBS_GBM_Tune,
                                                          MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                          type = "prob"))

##################################
# Reporting the independent evaluation results
# for the test set
##################################
MBS_GBM_Test_ROC <- roc(response = MBS_GBM_Test$MBS_GBM_Test_Observed,
                       predictor = MBS_GBM_Test$MBS_GBM_Test_Predicted.M,
                       levels = rev(levels(MBS_GBM_Test$MBS_GBM_Test_Observed)))

(MBS_GBM_Test_AUROC <- auc(MBS_GBM_Test_ROC)[1])
## [1] 0.982562
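The AUROC summarizes ranking quality across all probability cutoffs; a class-level view at a single cutoff can be obtained from the same test predictions. A minimal sketch, assuming the `MBS_GBM_Test` data frame built above and an illustrative 0.5 cutoff:

```r
##################################
# Converting the malignant-class probabilities to hard labels
# at an illustrative 0.5 cutoff and cross-tabulating them
# against the observed classes
##################################
MBS_GBM_Test_Class <- factor(ifelse(MBS_GBM_Test$MBS_GBM_Test_Predicted.M >= 0.5, "M", "B"),
                             levels = levels(MBS_GBM_Test$MBS_GBM_Test_Observed))
table(Predicted = MBS_GBM_Test_Class,
      Observed  = MBS_GBM_Test$MBS_GBM_Test_Observed)
```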

1.5.3 Extreme Gradient Boosting (MBS_XGB)


Details.

Code Chunk | Output
##################################
# Setting the conditions
# for hyperparameter tuning
##################################
XGB_Grid = expand.grid(nrounds = 500,
                      max_depth = c(4,5,6),
                      eta = c(0.2,0.3,0.4),
                      gamma = c(0.1,0.01,0.001),
                      colsample_bytree = 1,
                      min_child_weight = 1,
                      subsample = 1)

##################################
# Running the extreme gradient boosting model
# by setting the caret method to 'xgbTree'
##################################
set.seed(12345678)
MBS_XGB_Tune <- train(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
                 y = MA_Train$diagnosis,
                 method = "xgbTree",
                 tuneGrid = XGB_Grid,
                 metric = "ROC",
                 trControl = RKFold_Control)

##################################
# Reporting the cross-validation results
# for the train set
##################################
MBS_XGB_Tune
## eXtreme Gradient Boosting 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 729, 729, 730, 730, 730, 730, ... 
## Resampling results across tuning parameters:
## 
##   eta  max_depth  gamma  ROC        Sens       Spec     
##   0.2  4          0.001  0.9577896  0.8935294  0.9426392
##   0.2  4          0.010  0.9571278  0.8941176  0.9422883
##   0.2  4          0.100  0.9565532  0.8947059  0.9433349
##   0.2  5          0.001  0.9572875  0.8964706  0.9426270
##   0.2  5          0.010  0.9574715  0.8982353  0.9384409
##   0.2  5          0.100  0.9561604  0.8964706  0.9419344
##   0.2  6          0.001  0.9579412  0.8982353  0.9419375
##   0.2  6          0.010  0.9569283  0.8958824  0.9412510
##   0.2  6          0.100  0.9571069  0.8982353  0.9401861
##   0.3  4          0.001  0.9586614  0.8994118  0.9433471
##   0.3  4          0.010  0.9577398  0.8982353  0.9419405
##   0.3  4          0.100  0.9562470  0.8994118  0.9412387
##   0.3  5          0.001  0.9589816  0.8970588  0.9405492
##   0.3  5          0.010  0.9579795  0.8958824  0.9394905
##   0.3  5          0.100  0.9561599  0.8958824  0.9433349
##   0.3  6          0.001  0.9570533  0.8970588  0.9422914
##   0.3  6          0.010  0.9557215  0.8964706  0.9415866
##   0.3  6          0.100  0.9563702  0.8988235  0.9415866
##   0.4  4          0.001  0.9580452  0.8941176  0.9450831
##   0.4  4          0.010  0.9578986  0.8935294  0.9433471
##   0.4  4          0.100  0.9559785  0.8935294  0.9426331
##   0.4  5          0.001  0.9571105  0.8917647  0.9366987
##   0.4  5          0.010  0.9571568  0.8911765  0.9384378
##   0.4  5          0.100  0.9548828  0.8958824  0.9398474
##   0.4  6          0.001  0.9578601  0.8976471  0.9377422
##   0.4  6          0.010  0.9570393  0.9000000  0.9380870
##   0.4  6          0.100  0.9544391  0.8964706  0.9394874
## 
## Tuning parameter 'nrounds' was held constant at a value of 500
## Tuning parameter 'colsample_bytree' was held constant at a value of 1
## 
## Tuning parameter 'min_child_weight' was held constant at a value of 1
## 
## Tuning parameter 'subsample' was held constant at a value of 1
## ROC was used to select the optimal model using the largest value.
## The final values used for the model were nrounds = 500, max_depth = 5, eta
##  = 0.3, gamma = 0.001, colsample_bytree = 1, min_child_weight = 1 and
##  subsample = 1.
MBS_XGB_Tune$finalModel
## ##### xgb.Booster
## raw: 531.8 Kb 
## call:
##   xgboost::xgb.train(params = list(eta = param$eta, max_depth = param$max_depth, 
##     gamma = param$gamma, colsample_bytree = param$colsample_bytree, 
##     min_child_weight = param$min_child_weight, subsample = param$subsample), 
##     data = x, nrounds = param$nrounds, objective = "binary:logistic")
## params (as set within xgb.train):
##   eta = "0.3", max_depth = "5", gamma = "0.001", colsample_bytree = "1", min_child_weight = "1", subsample = "1", objective = "binary:logistic", validate_parameters = "TRUE"
## xgb.attributes:
##   niter
## callbacks:
##   cb.print.evaluation(period = print_every_n)
## # of features: 6 
## niter: 500
## nfeatures : 6 
## xNames : texture_mean smoothness_mean compactness_se texture_worst smoothness_worst symmetry_worst 
## problemType : Classification 
## tuneValue :
##     nrounds max_depth eta gamma colsample_bytree min_child_weight subsample
## 13     500         5 0.3 0.001                1                1         1
## obsLevels : M B 
## param :
##  list()
MBS_XGB_Tune$results
##    eta max_depth gamma colsample_bytree min_child_weight subsample nrounds
## 1  0.2         4 0.001                1                1         1     500
## 2  0.2         4 0.010                1                1         1     500
## 3  0.2         4 0.100                1                1         1     500
## 10 0.3         4 0.001                1                1         1     500
## 11 0.3         4 0.010                1                1         1     500
## 12 0.3         4 0.100                1                1         1     500
## 19 0.4         4 0.001                1                1         1     500
## 20 0.4         4 0.010                1                1         1     500
## 21 0.4         4 0.100                1                1         1     500
## 4  0.2         5 0.001                1                1         1     500
## 5  0.2         5 0.010                1                1         1     500
## 6  0.2         5 0.100                1                1         1     500
## 13 0.3         5 0.001                1                1         1     500
## 14 0.3         5 0.010                1                1         1     500
## 15 0.3         5 0.100                1                1         1     500
## 22 0.4         5 0.001                1                1         1     500
## 23 0.4         5 0.010                1                1         1     500
## 24 0.4         5 0.100                1                1         1     500
## 7  0.2         6 0.001                1                1         1     500
## 8  0.2         6 0.010                1                1         1     500
## 9  0.2         6 0.100                1                1         1     500
## 16 0.3         6 0.001                1                1         1     500
## 17 0.3         6 0.010                1                1         1     500
## 18 0.3         6 0.100                1                1         1     500
## 25 0.4         6 0.001                1                1         1     500
## 26 0.4         6 0.010                1                1         1     500
## 27 0.4         6 0.100                1                1         1     500
##          ROC      Sens      Spec      ROCSD     SensSD     SpecSD
## 1  0.9577896 0.8935294 0.9426392 0.01687288 0.05221451 0.02278518
## 2  0.9571278 0.8941176 0.9422883 0.01680119 0.05199315 0.02465544
## 3  0.9565532 0.8947059 0.9433349 0.01807507 0.05313825 0.02178664
## 10 0.9586614 0.8994118 0.9433471 0.01770361 0.04644204 0.02428003
## 11 0.9577398 0.8982353 0.9419405 0.01744703 0.05232484 0.02477863
## 12 0.9562470 0.8994118 0.9412387 0.01711876 0.05279801 0.02353039
## 19 0.9580452 0.8941176 0.9450831 0.01815255 0.05233862 0.02081910
## 20 0.9578986 0.8935294 0.9433471 0.01856427 0.04789394 0.02100887
## 21 0.9559785 0.8935294 0.9426331 0.01912049 0.05490634 0.02360832
## 4  0.9572875 0.8964706 0.9426270 0.01809942 0.05327374 0.02092341
## 5  0.9574715 0.8982353 0.9384409 0.01855274 0.05317893 0.01921827
## 6  0.9561604 0.8964706 0.9419344 0.01741132 0.05542902 0.01875888
## 13 0.9589816 0.8970588 0.9405492 0.01887034 0.04987007 0.01807181
## 14 0.9579795 0.8958824 0.9394905 0.01858169 0.05249677 0.02024547
## 15 0.9561599 0.8958824 0.9433349 0.01773607 0.05351675 0.01793777
## 22 0.9571105 0.8917647 0.9366987 0.01917362 0.05231795 0.02404140
## 23 0.9571568 0.8911765 0.9384378 0.01975233 0.05302281 0.02355760
## 24 0.9548828 0.8958824 0.9398474 0.01984338 0.05232484 0.02505943
## 7  0.9579412 0.8982353 0.9419375 0.01852167 0.04706648 0.02263389
## 8  0.9569283 0.8958824 0.9412510 0.01840797 0.04931041 0.02182010
## 9  0.9571069 0.8982353 0.9401861 0.01786090 0.05662560 0.01908666
## 16 0.9570533 0.8970588 0.9422914 0.01803019 0.05129522 0.02147733
## 17 0.9557215 0.8964706 0.9415866 0.01881557 0.04959467 0.02071598
## 18 0.9563702 0.8988235 0.9415866 0.01801038 0.05425255 0.02129219
## 25 0.9578601 0.8976471 0.9377422 0.01829579 0.05377878 0.02572752
## 26 0.9570393 0.9000000 0.9380870 0.01744502 0.05216617 0.02223410
## 27 0.9544391 0.8964706 0.9394874 0.01859835 0.05444488 0.02178696
(MBS_XGB_Train_AUROC <- MBS_XGB_Tune$results[MBS_XGB_Tune$results$nrounds==MBS_XGB_Tune$bestTune$nrounds &
                                             MBS_XGB_Tune$results$max_depth==MBS_XGB_Tune$bestTune$max_depth &
                                             MBS_XGB_Tune$results$eta==MBS_XGB_Tune$bestTune$eta &
                                             MBS_XGB_Tune$results$gamma==MBS_XGB_Tune$bestTune$gamma &
                                             MBS_XGB_Tune$results$colsample_bytree==MBS_XGB_Tune$bestTune$colsample_bytree &
                                             MBS_XGB_Tune$results$min_child_weight==MBS_XGB_Tune$bestTune$min_child_weight &
                                             MBS_XGB_Tune$results$subsample==MBS_XGB_Tune$bestTune$subsample,
                                             c("ROC")])
## [1] 0.9589816
##################################
# Identifying and plotting the
# best model predictors
##################################
MBS_XGB_VarImp <- varImp(MBS_XGB_Tune, scale = TRUE)
plot(MBS_XGB_VarImp,
     top=6,
     scales=list(y=list(cex = .95)),
     main="Ranked Variable Importance : Extreme Gradient Boosting",
     xlab="Scaled Variable Importance Metrics",
     ylab="Predictors",
     cex=2,
     origin=0,
     alpha=0.45)

##################################
# Independently evaluating the model
# on the test set
##################################
MBS_XGB_Test <- data.frame(MBS_XGB_Test_Observed = MA_Test$diagnosis,
                          MBS_XGB_Test_Predicted = predict(MBS_XGB_Tune,
                                                          MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                          type = "prob"))

##################################
# Reporting the independent evaluation results
# for the test set
##################################
MBS_XGB_Test_ROC <- roc(response = MBS_XGB_Test$MBS_XGB_Test_Observed,
                       predictor = MBS_XGB_Test$MBS_XGB_Test_Predicted.M,
                       levels = rev(levels(MBS_XGB_Test$MBS_XGB_Test_Observed)))

(MBS_XGB_Test_AUROC <- auc(MBS_XGB_Test_ROC)[1])
## [1] 0.9830651

1.6 Model Bagging

1.6.1 Random Forest (MBG_RF)


Details.

Code Chunk | Output
##################################
# Setting the conditions
# for hyperparameter tuning
##################################
RF_Grid = data.frame(mtry = c(25,75,125))

##################################
# Running the random forest model
# by setting the caret method to 'rf'
##################################
set.seed(12345678)
MBG_RF_Tune <- train(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
                 y = MA_Train$diagnosis,
                 method = "rf",
                 tuneGrid = RF_Grid,
                 metric = "ROC",
                 trControl = RKFold_Control)

##################################
# Reporting the cross-validation results
# for the train set
##################################
MBG_RF_Tune
## Random Forest 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 729, 729, 730, 730, 730, 730, ... 
## Resampling results across tuning parameters:
## 
##   mtry  ROC        Sens       Spec     
##    25   0.9605251  0.8929412  0.9416018
##    75   0.9609714  0.8970588  0.9436949
##   125   0.9602112  0.8964706  0.9423005
## 
## ROC was used to select the optimal model using the largest value.
## The final value used for the model was mtry = 75.
MBG_RF_Tune$finalModel
## 
## Call:
##  randomForest(x = x, y = y, mtry = param$mtry) 
##                Type of random forest: classification
##                      Number of trees: 500
## No. of variables tried at each split: 6
## 
##         OOB estimate of  error rate: 4.17%
## Confusion matrix:
##     M   B class.error
## M 322  18  0.05294118
## B  20 552  0.03496503
MBG_RF_Tune$results
##   mtry       ROC      Sens      Spec      ROCSD     SensSD     SpecSD
## 1   25 0.9605251 0.8929412 0.9416018 0.01542129 0.05068021 0.02199494
## 2   75 0.9609714 0.8970588 0.9436949 0.01509025 0.05129522 0.02118903
## 3  125 0.9602112 0.8964706 0.9423005 0.01661490 0.05067309 0.02317906
(MBG_RF_Train_AUROC <- MBG_RF_Tune$results[MBG_RF_Tune$results$mtry==MBG_RF_Tune$bestTune$mtry,
                                           c("ROC")])
## [1] 0.9609714
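Note that the `mtry` candidates in `RF_Grid` (25, 75, 125) all exceed the 6 available predictors; `randomForest` caps `mtry` at the number of predictors (with a warning), which is why the final model reports "No. of variables tried at each split: 6" and why the three candidate ROC values are nearly identical. A hedged sketch of a grid bounded by the actual predictor count:

```r
##################################
# Keeping the mtry candidates within the number of predictors
# actually supplied to the model (6 here), e.g. spanning
# sqrt(p) up to p
##################################
p <- sum(!names(MA_Train) %in% c("diagnosis"))
RF_Grid <- data.frame(mtry = unique(pmin(c(floor(sqrt(p)), floor(p / 2), p), p)))
RF_Grid
```

With a grid like this, the cross-validation actually discriminates between distinct per-split feature subsets instead of evaluating the same capped model three times.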
##################################
# Identifying and plotting the
# best model predictors
##################################
MBG_RF_VarImp <- varImp(MBG_RF_Tune, scale = TRUE)
plot(MBG_RF_VarImp,
     top=6,
     scales=list(y=list(cex = .95)),
     main="Ranked Variable Importance : Random Forest",
     xlab="Scaled Variable Importance Metrics",
     ylab="Predictors",
     cex=2,
     origin=0,
     alpha=0.45)

##################################
# Independently evaluating the model
# on the test set
##################################
MBG_RF_Test <- data.frame(MBG_RF_Test_Observed = MA_Test$diagnosis,
                          MBG_RF_Test_Predicted = predict(MBG_RF_Tune,
                                                          MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                          type = "prob"))

##################################
# Reporting the independent evaluation results
# for the test set
##################################
MBG_RF_Test_ROC <- roc(response = MBG_RF_Test$MBG_RF_Test_Observed,
                       predictor = MBG_RF_Test$MBG_RF_Test_Predicted.M,
                       levels = rev(levels(MBG_RF_Test$MBG_RF_Test_Observed)))

(MBG_RF_Test_AUROC <- auc(MBG_RF_Test_ROC)[1])
## [1] 0.9935446

1.6.2 Bagged Classification and Regression Trees (MBG_BTREE)


Bagged Classification and Regression Trees fit a CART model to repeated bootstrap resamples of the train set and aggregate the individual trees' predictions by voting. The model was implemented here through the caret method treebag with 50 bootstrap replications (nbagg = 50); no hyperparameter tuning was required. Performance was estimated with repeated 5-fold cross-validation using the AUROC metric and independently verified on the test set.

Code Chunk | Output
##################################
# Setting the conditions
# for hyperparameter tuning
##################################
# No hyperparameter tuning process required

##################################
# Running the bagged CART model
# by setting the caret method to 'treebag'
##################################
set.seed(12345678)
MBG_BTREE_Tune <- train(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
                        y = MA_Train$diagnosis,
                        method = "treebag",
                        nbagg = 50,
                        metric = "ROC",
                        trControl = RKFold_Control)

##################################
# Reporting the cross-validation results
# for the train set
##################################
MBG_BTREE_Tune
## Bagged CART 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 729, 729, 730, 730, 730, 730, ... 
## Resampling results:
## 
##   ROC       Sens       Spec     
##   0.957904  0.8976471  0.9457818
MBG_BTREE_Tune$finalModel
## 
## Bagging classification trees with 50 bootstrap replications
MBG_BTREE_Tune$results
##   parameter      ROC      Sens      Spec      ROCSD     SensSD     SpecSD
## 1      none 0.957904 0.8976471 0.9457818 0.01659441 0.04793155 0.02103826
(MBG_BTREE_Train_AUROC <- MBG_BTREE_Tune$results$ROC)
## [1] 0.957904
##################################
# Identifying and plotting the
# best model predictors
##################################
MBG_BTREE_VarImp <- varImp(MBG_BTREE_Tune, scale = TRUE)
plot(MBG_BTREE_VarImp,
     top=6,
     scales=list(y=list(cex = .95)),
     main="Ranked Variable Importance : Bagged Classification and Regression Trees",
     xlab="Scaled Variable Importance Metrics",
     ylab="Predictors",
     cex=2,
     origin=0,
     alpha=0.45)

##################################
# Independently evaluating the model
# on the test set
##################################
MBG_BTREE_Test <- data.frame(MBG_BTREE_Test_Observed = MA_Test$diagnosis,
                             MBG_BTREE_Test_Predicted = predict(MBG_BTREE_Tune,
                                                                MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                                type = "prob"))

##################################
# Reporting the independent evaluation results
# for the test set
##################################
MBG_BTREE_Test_ROC <- roc(response = MBG_BTREE_Test$MBG_BTREE_Test_Observed,
                          predictor = MBG_BTREE_Test$MBG_BTREE_Test_Predicted.M,
                          levels = rev(levels(MBG_BTREE_Test$MBG_BTREE_Test_Observed)))

(MBG_BTREE_Test_AUROC <- auc(MBG_BTREE_Test_ROC)[1])
## [1] 0.9928739
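The treebag model above aggregates 50 trees by majority vote over bootstrap resamples. A minimal base-R sketch of that aggregation scheme, substituting a trivial one-split "stump" for a full CART tree (toy data; the variable names and threshold rule are illustrative assumptions, not caret internals):

```r
set.seed(12345678)

# Toy one-predictor, two-class data (illustrative only)
x <- c(rnorm(30, mean = 2), rnorm(30, mean = 4))
y <- factor(rep(c("B", "M"), each = 30))

# A trivial "stump": classify M when x exceeds the bootstrap-sample mean
fit_stump  <- function(x) mean(x)
pred_stump <- function(cut, x) ifelse(x > cut, "M", "B")

# Bagging: fit the stump on 50 bootstrap resamples, predict on all points,
# then take the majority vote across resamples for each observation
nbagg <- 50
votes <- sapply(seq_len(nbagg), function(i) {
  idx <- sample(length(x), replace = TRUE)   # bootstrap resample
  pred_stump(fit_stump(x[idx]), x)
})
bagged <- apply(votes, 1, function(v) names(which.max(table(v))))

mean(bagged == y)   # in-sample accuracy of the bagged ensemble
```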

1.7 Model Ensemble and Stacking

1.7.1 Base Learner Model Development using Linear Discriminant Analysis (BAL_LDA)


Linear Discriminant Analysis finds the linear combination of predictors that best separates the two diagnosis classes. The model was implemented here as a base learner through the caret method lda with centered and scaled predictors; no hyperparameter tuning was required. Performance was estimated with repeated 5-fold cross-validation using the AUROC metric and independently verified on the test set.

Code Chunk | Output
##################################
# Setting the conditions
# for hyperparameter tuning
##################################
# No hyperparameter tuning process required

##################################
# Running the linear discriminant analysis model
# by setting the caret method to 'lda'
##################################
set.seed(12345678)
BAL_LDA_Tune <- train(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
                      y = MA_Train$diagnosis,
                      method = "lda",
                      preProc = c("center","scale"),
                      metric = "ROC",
                      trControl = RKFold_Control)

##################################
# Reporting the cross-validation results
# for the train set
##################################
BAL_LDA_Tune
## Linear Discriminant Analysis 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## Pre-processing: centered (6), scaled (6) 
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 729, 729, 730, 730, 730, 730, ... 
## Resampling results:
## 
##   ROC        Sens       Spec     
##   0.8736974  0.6988235  0.8831976
BAL_LDA_Tune$finalModel
## Call:
## lda(x, y)
## 
## Prior probabilities of groups:
##        M        B 
## 0.372807 0.627193 
## 
## Group means:
##   texture_mean smoothness_mean compactness_se texture_worst smoothness_worst
## M    0.5472116       0.4765617      0.4590688     0.5971745        0.5506486
## B   -0.3252656      -0.2832710     -0.2728731    -0.3549639       -0.3273086
##   symmetry_worst
## M      0.4949332
## B     -0.2941911
## 
## Coefficients of linear discriminants:
##                         LD1
## texture_mean     -0.5101493
## smoothness_mean  -0.2598153
## compactness_se   -0.2404049
## texture_worst    -0.2745054
## smoothness_worst -0.3118616
## symmetry_worst   -0.3480006
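For context on the coefficients above (general two-class LDA math, not the fitted model's numbers): the single discriminant direction LD1 is proportional to the inverse pooled within-class covariance times the difference of the group mean vectors. A base-R sketch on toy data:

```r
set.seed(1)

# Toy two-class data with two predictors (illustrative only)
n  <- 100
X1 <- cbind(rnorm(n, 1), rnorm(n, 1))   # class M
X2 <- cbind(rnorm(n, 0), rnorm(n, 0))   # class B

# Pooled within-class covariance
S <- ((n - 1) * cov(X1) + (n - 1) * cov(X2)) / (2 * n - 2)

# LD1 direction: w proportional to S^-1 (mu_M - mu_B)
w <- solve(S, colMeans(X1) - colMeans(X2))

# Projecting onto w separates the classes along a single dimension
scores_M <- X1 %*% w
scores_B <- X2 %*% w
mean(scores_M) > mean(scores_B)   # TRUE: class means separate along LD1
```

The projected class means always separate in this direction because their difference along w equals a positive-definite quadratic form in the mean difference.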
BAL_LDA_Tune$results
##   parameter       ROC      Sens      Spec      ROCSD     SensSD    SpecSD
## 1      none 0.8736974 0.6988235 0.8831976 0.02765351 0.05043067 0.0340081
(BAL_LDA_Train_AUROC <- BAL_LDA_Tune$results$ROC)
## [1] 0.8736974
##################################
# Identifying and plotting the
# best model predictors
##################################
BAL_LDA_VarImp <- varImp(BAL_LDA_Tune, scale = TRUE)
plot(BAL_LDA_VarImp,
     top=6,
     scales=list(y=list(cex = .95)),
     main="Ranked Variable Importance : Linear Discriminant Analysis",
     xlab="Scaled Variable Importance Metrics",
     ylab="Predictors",
     cex=2,
     origin=0,
     alpha=0.45)

##################################
# Independently evaluating the model
# on the test set
##################################
BAL_LDA_Test <- data.frame(BAL_LDA_Test_Observed = MA_Test$diagnosis,
                           BAL_LDA_Test_Predicted = predict(BAL_LDA_Tune,
                                                            MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                            type = "prob"))

BAL_LDA_Test
##      BAL_LDA_Test_Observed BAL_LDA_Test_Predicted.M BAL_LDA_Test_Predicted.B
## 8                        M              0.847694277               0.15230572
## 13                       M              0.752143565               0.24785644
## 16                       M              0.984315261               0.01568474
## 19                       M              0.648181426               0.35181857
## 24                       M              0.622861398               0.37713860
## 31                       M              0.917386313               0.08261369
## 37                       M              0.793173364               0.20682664
## 42                       M              0.907919179               0.09208082
## 44                       M              0.764230707               0.23576929
## 46                       M              0.728618101               0.27138190
## 48                       M              0.859933443               0.14006656
## 54                       M              0.491166745               0.50883326
## 61                       B              0.156553430               0.84344657
## 65                       M              0.944349947               0.05565005
## 69                       B              0.781343429               0.21865657
## 70                       B              0.048409216               0.95159078
## 71                       M              0.248016285               0.75198371
## 72                       B              0.071364929               0.92863507
## 82                       B              0.539918492               0.46008151
## 89                       B              0.527909798               0.47209020
## 93                       B              0.007363099               0.99263690
## 98                       B              0.155923262               0.84407674
## 103                      B              0.143938966               0.85606103
## 104                      B              0.481557196               0.51844280
## 107                      B              0.675382693               0.32461731
## 109                      M              0.958051423               0.04194858
## 116                      B              0.481356202               0.51864380
## 123                      M              0.935249986               0.06475001
## 125                      B              0.021730402               0.97826960
## 129                      B              0.262831191               0.73716881
## 130                      M              0.836436129               0.16356387
## 136                      M              0.544334142               0.45566586
## 139                      M              0.567922485               0.43207751
## 147                      M              0.805510959               0.19448904
## 149                      B              0.119929629               0.88007037
## 150                      B              0.035624393               0.96437561
## 162                      M              0.048283884               0.95171612
## 167                      B              0.003383242               0.99661676
## 185                      M              0.494169519               0.50583048
## 191                      M              0.984766695               0.01523330
## 194                      M              0.965697843               0.03430216
## 197                      M              0.923211780               0.07678822
## 200                      M              0.829831914               0.17016809
## 201                      B              0.397595099               0.60240490
## 209                      B              0.832870585               0.16712942
## 211                      M              0.490217911               0.50978209
## 213                      M              0.051277757               0.94872224
## 217                      B              0.578278588               0.42172141
## 221                      B              0.028330155               0.97166984
## 222                      B              0.127351126               0.87264887
## 223                      B              0.321834029               0.67816597
## 228                      B              0.080646732               0.91935327
## 237                      M              0.878798075               0.12120193
## 239                      B              0.386770703               0.61322930
## 242                      B              0.017488284               0.98251172
## 246                      B              0.572817303               0.42718270
## 256                      M              0.450075130               0.54992487
## 259                      M              0.910731982               0.08926802
## 262                      M              0.300394459               0.69960554
## 263                      M              0.556262598               0.44373740
## 266                      M              0.914981868               0.08501813
## 272                      B              0.037223108               0.96277689
## 274                      B              0.115142932               0.88485707
## 275                      M              0.477067718               0.52293228
## 285                      B              0.017755227               0.98224477
## 300                      B              0.221001328               0.77899867
## 308                      B              0.009577361               0.99042264
## 328                      B              0.017205318               0.98279468
## 345                      B              0.152736274               0.84726373
## 349                      B              0.125805661               0.87419434
## 356                      B              0.100082662               0.89991734
## 363                      B              0.279268326               0.72073167
## 365                      B              0.067382598               0.93261740
## 368                      B              0.289130560               0.71086944
## 382                      B              0.059008473               0.94099153
## 383                      B              0.099940629               0.90005937
## 387                      B              0.029501526               0.97049847
## 388                      B              0.011594911               0.98840509
## 401                      M              0.916015243               0.08398476
## 403                      B              0.121740628               0.87825937
## 417                      B              0.688697152               0.31130285
## 420                      B              0.438908267               0.56109173
## 428                      B              0.538028898               0.46197110
## 434                      M              0.754075514               0.24592449
## 442                      M              0.720665044               0.27933496
## 444                      B              0.060993346               0.93900665
## 445                      M              0.127952064               0.87204794
## 454                      B              0.063780108               0.93621989
## 455                      B              0.116970237               0.88302976
## 460                      B              0.398746415               0.60125359
## 462                      M              0.848632474               0.15136753
## 463                      B              0.214670122               0.78532988
## 472                      B              0.424438993               0.57556101
## 484                      B              0.099533312               0.90046669
## 489                      B              0.266301333               0.73369867
## 493                      M              0.569012819               0.43098718
## 494                      B              0.003052093               0.99694791
## 497                      B              0.533620893               0.46637911
## 498                      B              0.182516881               0.81748312
## 501                      B              0.056152054               0.94384795
## 502                      M              0.965124814               0.03487519
## 507                      B              0.469228787               0.53077121
## 509                      B              0.047121869               0.95287813
## 525                      B              0.140788342               0.85921166
## 526                      B              0.147559515               0.85244048
## 527                      B              0.638000064               0.36199994
## 531                      B              0.288554915               0.71144509
## 532                      B              0.651049032               0.34895097
## 534                      M              0.415894843               0.58410516
## 537                      M              0.705945640               0.29405436
## 544                      B              0.493009474               0.50699053
## 546                      B              0.442267183               0.55773282
## 548                      B              0.210657721               0.78934228
## 550                      B              0.511181378               0.48881862
## 551                      B              0.067861362               0.93213864
## 556                      B              0.672400732               0.32759927
## 557                      B              0.194179282               0.80582072
## 575                      M              0.804848412               0.19515159
## 578                      M              0.961876926               0.03812307
## 581                      M              0.641329989               0.35867001
## 583                      M              0.428761786               0.57123821
## 589                      B              0.116915613               0.88308439
## 590                      B              0.225642380               0.77435762
## 601                      M              0.892598780               0.10740122
## 603                      M              0.904195640               0.09580436
## 611                      M              0.907919179               0.09208082
## 617                      M              0.859933443               0.14006656
## 619                      B              0.375773201               0.62422680
## 625                      B              0.221742438               0.77825756
## 628                      B              0.043025434               0.95697457
## 632                      M              0.860984905               0.13901509
## 646                      B              0.055755243               0.94424476
## 649                      B              0.248132208               0.75186779
## 657                      M              0.747532205               0.25246780
## 662                      B              0.007363099               0.99263690
## 665                      M              0.729748372               0.27025163
## 677                      B              0.169948174               0.83005183
## 679                      B              0.579052997               0.42094700
## 685                      B              0.481356202               0.51864380
## 687                      M              0.797009318               0.20299068
## 689                      M              0.379875103               0.62012490
## 695                      B              0.052838610               0.94716139
## 701                      M              0.524264466               0.47573553
## 704                      M              0.632072177               0.36792782
## 706                      B              0.051096277               0.94890372
## 709                      B              0.055780027               0.94421997
## 715                      B              0.209980973               0.79001903
## 726                      M              0.588590272               0.41140973
## 734                      M              0.556955275               0.44304472
## 747                      M              0.650003433               0.34999657
## 752                      M              0.696270542               0.30372946
## 763                      M              0.965697843               0.03430216
## 765                      B              0.068697986               0.93130201
## 775                      M              0.266641278               0.73335872
## 780                      M              0.490217911               0.50978209
## 786                      B              0.578278588               0.42172141
## 792                      B              0.321834029               0.67816597
## 796                      B              0.081823993               0.91817601
## 809                      M              0.971682884               0.02831712
## 813                      B              0.213421626               0.78657837
## 816                      B              0.079525623               0.92047438
## 818                      B              0.817347399               0.18265260
## 820                      M              0.681962716               0.31803728
## 823                      M              0.336004147               0.66399585
## 850                      M              0.944642119               0.05535788
## 854                      B              0.017755227               0.98224477
## 865                      B              0.021146573               0.97885343
## 867                      M              0.051514297               0.94848570
## 870                      M              0.652912941               0.34708706
## 876                      B              0.033179561               0.96682044
## 882                      B              0.029474840               0.97052516
## 886                      B              0.004334774               0.99566523
## 895                      B              0.157840339               0.84215966
## 896                      B              0.013873786               0.98612621
## 905                      M              0.647891355               0.35210865
## 906                      B              0.032661361               0.96733864
## 913                      M              0.809506116               0.19049388
## 917                      B              0.063686686               0.93631331
## 919                      B              0.196123444               0.80387656
## 922                      M              0.720066525               0.27993347
## 923                      M              0.853122900               0.14687710
## 925                      B              0.100082662               0.89991734
## 928                      B              0.027529187               0.97247081
## 932                      B              0.279268326               0.72073167
## 936                      M              0.883947846               0.11605215
## 941                      B              0.010656141               0.98934386
## 950                      B              0.303759143               0.69624086
## 953                      B              0.403505824               0.59649418
## 956                      B              0.029501526               0.97049847
## 967                      B              0.026289767               0.97371023
## 973                      B              0.164651800               0.83534820
## 974                      B              0.011316735               0.98868327
## 976                      B              0.059212838               0.94078716
## 980                      B              0.367183351               0.63281665
## 985                      B              0.640716764               0.35928324
## 987                      M              0.778924037               0.22107596
## 993                      B              0.238399926               0.76160007
## 1010                     B              0.293568883               0.70643112
## 1015                     B              0.640576383               0.35942362
## 1023                     B              0.063780108               0.93621989
## 1025                     B              0.634929643               0.36507036
## 1030                     M              0.893535322               0.10646468
## 1034                     B              0.062032839               0.93796716
## 1043                     B              0.306855621               0.69314438
## 1045                     B              0.119438841               0.88056116
## 1046                     B              0.281449575               0.71855042
## 1047                     B              0.020135949               0.97986405
## 1052                     B              0.140996234               0.85900377
## 1060                     B              0.423288399               0.57671160
## 1061                     B              0.002198833               0.99780117
## 1066                     B              0.533620893               0.46637911
## 1068                     M              0.279349586               0.72065041
## 1071                     M              0.965124814               0.03487519
## 1072                     B              0.409662210               0.59033779
## 1077                     B              0.473035808               0.52696419
## 1090                     B              0.596599276               0.40340072
## 1094                     B              0.140788342               0.85921166
## 1097                     B              0.034350449               0.96564955
## 1098                     B              0.067500501               0.93249950
## 1105                     M              0.539179210               0.46082079
## 1106                     M              0.705945640               0.29405436
## 1109                     B              0.851517966               0.14848203
## 1113                     B              0.493009474               0.50699053
## 1129                     B              0.516810172               0.48318983
## 1136                     M              0.536782216               0.46321778
## 1138                     B              0.044202476               0.95579752
##################################
# Reporting the independent evaluation results
# for the test set
##################################
BAL_LDA_Test_ROC <- roc(response = BAL_LDA_Test$BAL_LDA_Test_Observed,
                        predictor = BAL_LDA_Test$BAL_LDA_Test_Predicted.M,
                        levels = rev(levels(BAL_LDA_Test$BAL_LDA_Test_Observed)))

(BAL_LDA_Test_AUROC <- auc(BAL_LDA_Test_ROC)[1])
## [1] 0.8984742

1.7.2 Base Learner Model Development using Classification and Regression Trees (BAL_CART)


Classification and Regression Trees recursively partition the predictor space into increasingly homogeneous nodes. The model was implemented here as a base learner through the caret method rpart, tuning the complexity parameter cp over a grid of candidate values (0.001 to 0.020). Performance was estimated with repeated 5-fold cross-validation using the AUROC metric and independently verified on the test set.

Code Chunk | Output
##################################
# Setting the conditions
# for hyperparameter tuning
##################################
CART_Grid = data.frame(cp = c(0.001, 0.005, 0.010, 0.015, 0.020))

##################################
# Running the classification and regression tree model
# by setting the caret method to 'rpart'
##################################
set.seed(12345678)
BAL_CART_Tune <- train(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
                       y = MA_Train$diagnosis,
                       method = "rpart",
                       tuneGrid = CART_Grid,
                       metric = "ROC",
                       trControl = RKFold_Control)

##################################
# Reporting the cross-validation results
# for the train set
##################################
BAL_CART_Tune
## CART 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 729, 729, 730, 730, 730, 730, ... 
## Resampling results across tuning parameters:
## 
##   cp     ROC        Sens       Spec     
##   0.001  0.8699967  0.7600000  0.8688787
##   0.005  0.8552277  0.7570588  0.8692357
##   0.010  0.8414964  0.7535294  0.8699252
##   0.015  0.8326147  0.7429412  0.8734249
##   0.020  0.8298568  0.7323529  0.8744744
## 
## ROC was used to select the optimal model using the largest value.
## The final value used for the model was cp = 0.001.
BAL_CART_Tune$finalModel
## n= 912 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##   1) root 912 340 B (0.37280702 0.62719298)  
##     2) texture_mean>=2.927988 473 196 M (0.58562368 0.41437632)  
##       4) smoothness_worst>=-1.52382 252  45 M (0.82142857 0.17857143)  
##         8) symmetry_worst>=-1.660064 150  15 M (0.90000000 0.10000000)  
##          16) smoothness_worst>=-1.471923 114   5 M (0.95614035 0.04385965) *
##          17) smoothness_worst< -1.471923 36  10 M (0.72222222 0.27777778)  
##            34) texture_mean>=3.007903 27   4 M (0.85185185 0.14814815) *
##            35) texture_mean< 3.007903 9   3 B (0.33333333 0.66666667) *
##         9) symmetry_worst< -1.660064 102  30 M (0.70588235 0.29411765)  
##          18) texture_mean>=3.079152 51   9 M (0.82352941 0.17647059) *
##          19) texture_mean< 3.079152 51  21 M (0.58823529 0.41176471)  
##            38) compactness_se>=-3.816486 28   6 M (0.78571429 0.21428571)  
##              76) texture_mean< 3.011847 17   0 M (1.00000000 0.00000000) *
##              77) texture_mean>=3.011847 11   5 B (0.45454545 0.54545455) *
##            39) compactness_se< -3.816486 23   8 B (0.34782609 0.65217391)  
##              78) smoothness_mean< -2.312955 10   4 M (0.60000000 0.40000000) *
##              79) smoothness_mean>=-2.312955 13   2 B (0.15384615 0.84615385) *
##       5) smoothness_worst< -1.52382 221  70 B (0.31674208 0.68325792)  
##        10) smoothness_mean>=-2.425944 88  43 M (0.51136364 0.48863636)  
##          20) texture_worst>=4.527762 58  18 M (0.68965517 0.31034483)  
##            40) symmetry_worst>=-1.966444 35   5 M (0.85714286 0.14285714) *
##            41) symmetry_worst< -1.966444 23  10 B (0.43478261 0.56521739)  
##              82) texture_worst< 4.804356 13   5 M (0.61538462 0.38461538) *
##              83) texture_worst>=4.804356 10   2 B (0.20000000 0.80000000) *
##          21) texture_worst< 4.527762 30   5 B (0.16666667 0.83333333) *
##        11) smoothness_mean< -2.425944 133  25 B (0.18796992 0.81203008)  
##          22) symmetry_worst>=-1.496954 8   2 M (0.75000000 0.25000000) *
##          23) symmetry_worst< -1.496954 125  19 B (0.15200000 0.84800000)  
##            46) symmetry_worst>=-1.695215 24   8 B (0.33333333 0.66666667)  
##              92) compactness_se< -4.114414 13   5 M (0.61538462 0.38461538) *
##              93) compactness_se>=-4.114414 11   0 B (0.00000000 1.00000000) *
##            47) symmetry_worst< -1.695215 101  11 B (0.10891089 0.89108911)  
##              94) compactness_se>=-3.611049 28   8 B (0.28571429 0.71428571)  
##               188) compactness_se< -3.197021 9   3 M (0.66666667 0.33333333) *
##               189) compactness_se>=-3.197021 19   2 B (0.10526316 0.89473684) *
##              95) compactness_se< -3.611049 73   3 B (0.04109589 0.95890411) *
##     3) texture_mean< 2.927988 439  63 B (0.14350797 0.85649203)  
##       6) symmetry_worst>=-1.325507 25   4 M (0.84000000 0.16000000) *
##       7) symmetry_worst< -1.325507 414  42 B (0.10144928 0.89855072)  
##        14) compactness_se>=-3.970723 153  33 B (0.21568627 0.78431373)  
##          28) smoothness_worst>=-1.451541 30  13 M (0.56666667 0.43333333)  
##            56) symmetry_worst>=-1.619683 14   1 M (0.92857143 0.07142857) *
##            57) symmetry_worst< -1.619683 16   4 B (0.25000000 0.75000000) *
##          29) smoothness_worst< -1.451541 123  16 B (0.13008130 0.86991870)  
##            58) compactness_se< -3.427747 70  15 B (0.21428571 0.78571429)  
##             116) compactness_se>=-3.705619 28  11 B (0.39285714 0.60714286)  
##               232) smoothness_mean>=-2.388266 17   7 M (0.58823529 0.41176471) *
##               233) smoothness_mean< -2.388266 11   1 B (0.09090909 0.90909091) *
##             117) compactness_se< -3.705619 42   4 B (0.09523810 0.90476190) *
##            59) compactness_se>=-3.427747 53   1 B (0.01886792 0.98113208) *
##        15) compactness_se< -3.970723 261   9 B (0.03448276 0.96551724) *
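Each line of the tree printout above follows the header format `node), split, n, loss, yval, (yprob)`: n is the number of train samples reaching the node, loss is the count of samples not belonging to the fitted class yval, and yprob gives the class proportions. As a check (arithmetic on the printed counts, not a refit), the root node's proportions follow directly:

```r
# Root node: 912 samples, fitted class B, loss = 340 (the M samples)
n_root <- 912
n_M    <- 340            # loss at the root: samples not in class B
n_B    <- n_root - n_M   # 572

round(c(M = n_M / n_root, B = n_B / n_root), 8)
# 0.37280702 0.62719298 -> matches the printed root yprob above
```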
BAL_CART_Tune$results
##      cp       ROC      Sens      Spec      ROCSD     SensSD     SpecSD
## 1 0.001 0.8699967 0.7600000 0.8688787 0.02958365 0.05611407 0.02331264
## 2 0.005 0.8552277 0.7570588 0.8692357 0.02789001 0.05975368 0.02778803
## 3 0.010 0.8414964 0.7535294 0.8699252 0.02430214 0.05134438 0.02773026
## 4 0.015 0.8326147 0.7429412 0.8734249 0.02618044 0.05304320 0.01922145
## 5 0.020 0.8298568 0.7323529 0.8744744 0.02663895 0.06018642 0.02293389
(BAL_CART_Train_AUROC <- BAL_CART_Tune$results[BAL_CART_Tune$results$cp==BAL_CART_Tune$bestTune$cp,
                                               c("ROC")])
## [1] 0.8699967
##################################
# Identifying and plotting the
# best model predictors
##################################
BAL_CART_VarImp <- varImp(BAL_CART_Tune, scale = TRUE)
plot(BAL_CART_VarImp,
     top=6,
     scales=list(y=list(cex = .95)),
     main="Ranked Variable Importance : Classification and Regression Trees",
     xlab="Scaled Variable Importance Metrics",
     ylab="Predictors",
     cex=2,
     origin=0,
     alpha=0.45)

##################################
# Independently evaluating the model
# on the test set
##################################
BAL_CART_Test <- data.frame(BAL_CART_Test_Observed = MA_Test$diagnosis,
                            BAL_CART_Test_Predicted = predict(BAL_CART_Tune,
                                                              MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                              type = "prob"))

BAL_CART_Test
##      BAL_CART_Test_Observed BAL_CART_Test_Predicted.M BAL_CART_Test_Predicted.B
## 8                         M                0.95614035                0.04385965
## 13                        M                0.85714286                0.14285714
## 16                        M                0.95614035                0.04385965
## 19                        M                0.82352941                0.17647059
## 24                        M                0.82352941                0.17647059
## 31                        M                0.95614035                0.04385965
## 37                        M                0.95614035                0.04385965
## 42                        M                0.15384615                0.84615385
## 44                        M                0.95614035                0.04385965
## 46                        M                0.84000000                0.16000000
## 48                        M                0.84000000                0.16000000
## 54                        M                0.16666667                0.83333333
## 61                        B                0.03448276                0.96551724
## 65                        M                0.95614035                0.04385965
## 69                        B                0.84000000                0.16000000
## 70                        B                0.03448276                0.96551724
## 71                        M                0.85714286                0.14285714
## 72                        B                0.01886792                0.98113208
## 82                        B                0.92857143                0.07142857
## 89                        B                0.00000000                1.00000000
## 93                        B                0.03448276                0.96551724
## 98                        B                0.15384615                0.84615385
## 103                       B                0.04109589                0.95890411
## 104                       B                0.15384615                0.84615385
## 107                       B                0.25000000                0.75000000
## 109                       M                0.95614035                0.04385965
## 116                       B                0.45454545                0.54545455
## 123                       M                0.95614035                0.04385965
## 125                       B                0.01886792                0.98113208
## 129                       B                0.01886792                0.98113208
## 130                       M                0.85714286                0.14285714
## 136                       M                0.82352941                0.17647059
## 139                       M                0.01886792                0.98113208
## 147                       M                0.84000000                0.16000000
## 149                       B                0.09523810                0.90476190
## 150                       B                0.03448276                0.96551724
## 162                       M                0.09090909                0.90909091
## 167                       B                0.03448276                0.96551724
## 185                       M                0.85714286                0.14285714
## 191                       M                0.95614035                0.04385965
## 194                       M                0.95614035                0.04385965
## 197                       M                0.95614035                0.04385965
## 200                       M                0.95614035                0.04385965
## 201                       B                0.60000000                0.40000000
## 209                       B                0.85185185                0.14814815
## 211                       M                0.85714286                0.14285714
## 213                       M                0.58823529                0.41176471
## 217                       B                0.01886792                0.98113208
## 221                       B                0.03448276                0.96551724
## 222                       B                0.09523810                0.90476190
## 223                       B                0.03448276                0.96551724
## 228                       B                0.09090909                0.90909091
## 237                       M                0.95614035                0.04385965
## 239                       B                0.66666667                0.33333333
## 242                       B                0.03448276                0.96551724
## 246                       B                0.15384615                0.84615385
## 256                       M                0.92857143                0.07142857
## 259                       M                0.95614035                0.04385965
## 262                       M                0.04109589                0.95890411
## 263                       M                0.85714286                0.14285714
## 266                       M                0.82352941                0.17647059
## 272                       B                0.03448276                0.96551724
## 274                       B                0.03448276                0.96551724
## 275                       M                0.82352941                0.17647059
## 285                       B                0.01886792                0.98113208
## 300                       B                0.16666667                0.83333333
## 308                       B                0.03448276                0.96551724
## 328                       B                0.03448276                0.96551724
## 345                       B                0.03448276                0.96551724
## 349                       B                0.03448276                0.96551724
## 356                       B                0.10526316                0.89473684
## 363                       B                0.85714286                0.14285714
## 365                       B                0.03448276                0.96551724
## 368                       B                0.03448276                0.96551724
## 382                       B                0.09523810                0.90476190
## 383                       B                0.10526316                0.89473684
## 387                       B                0.09090909                0.90909091
## 388                       B                0.03448276                0.96551724
## 401                       M                0.95614035                0.04385965
## 403                       B                0.01886792                0.98113208
## 417                       B                0.15384615                0.84615385
## 420                       B                0.85714286                0.14285714
## 428                       B                0.00000000                1.00000000
## 434                       M                0.85185185                0.14814815
## 442                       M                0.82352941                0.17647059
## 444                       B                0.03448276                0.96551724
## 445                       M                0.03448276                0.96551724
## 454                       B                0.03448276                0.96551724
## 455                       B                0.03448276                0.96551724
## 460                       B                0.04109589                0.95890411
## 462                       M                0.82352941                0.17647059
## 463                       B                0.04109589                0.95890411
## 472                       B                0.04109589                0.95890411
## 484                       B                0.03448276                0.96551724
## 489                       B                0.03448276                0.96551724
## 493                       M                0.85185185                0.14814815
## 494                       B                0.03448276                0.96551724
## 497                       B                0.92857143                0.07142857
## 498                       B                0.03448276                0.96551724
## 501                       B                0.03448276                0.96551724
## 502                       M                0.95614035                0.04385965
## 507                       B                0.15384615                0.84615385
## 509                       B                0.03448276                0.96551724
## 525                       B                0.09523810                0.90476190
## 526                       B                0.03448276                0.96551724
## 527                       B                0.95614035                0.04385965
## 531                       B                0.03448276                0.96551724
## 532                       B                0.95614035                0.04385965
## 534                       M                0.85714286                0.14285714
## 537                       M                0.82352941                0.17647059
## 544                       B                0.04109589                0.95890411
## 546                       B                0.85714286                0.14285714
## 548                       B                0.09090909                0.90909091
## 550                       B                0.00000000                1.00000000
## 551                       B                0.04109589                0.95890411
## 556                       B                0.82352941                0.17647059
## 557                       B                0.16666667                0.83333333
## 575                       M                0.84000000                0.16000000
## 578                       M                0.95614035                0.04385965
## 581                       M                0.84000000                0.16000000
## 583                       M                0.66666667                0.33333333
## 589                       B                0.03448276                0.96551724
## 590                       B                0.09523810                0.90476190
## 601                       M                0.95614035                0.04385965
## 603                       M                0.95614035                0.04385965
## 611                       M                0.15384615                0.84615385
## 617                       M                0.84000000                0.16000000
## 619                       B                0.04109589                0.95890411
## 625                       B                0.16666667                0.83333333
## 628                       B                0.04109589                0.95890411
## 632                       M                0.82352941                0.17647059
## 646                       B                0.58823529                0.41176471
## 649                       B                0.03448276                0.96551724
## 657                       M                0.85714286                0.14285714
## 662                       B                0.03448276                0.96551724
## 665                       M                0.85714286                0.14285714
## 677                       B                0.03448276                0.96551724
## 679                       B                0.60000000                0.40000000
## 685                       B                0.45454545                0.54545455
## 687                       M                0.92857143                0.07142857
## 689                       M                0.75000000                0.25000000
## 695                       B                0.03448276                0.96551724
## 701                       M                0.15384615                0.84615385
## 704                       M                0.95614035                0.04385965
## 706                       B                0.03448276                0.96551724
## 709                       B                0.01886792                0.98113208
## 715                       B                0.01886792                0.98113208
## 726                       M                0.45454545                0.54545455
## 734                       M                0.75000000                0.25000000
## 747                       M                0.33333333                0.66666667
## 752                       M                0.85185185                0.14814815
## 763                       M                0.95614035                0.04385965
## 765                       B                0.03448276                0.96551724
## 775                       M                0.92857143                0.07142857
## 780                       M                0.85714286                0.14285714
## 786                       B                0.01886792                0.98113208
## 792                       B                0.03448276                0.96551724
## 796                       B                0.03448276                0.96551724
## 809                       M                0.82352941                0.17647059
## 813                       B                0.04109589                0.95890411
## 816                       B                0.03448276                0.96551724
## 818                       B                0.95614035                0.04385965
## 820                       M                0.85714286                0.14285714
## 823                       M                0.03448276                0.96551724
## 850                       M                0.95614035                0.04385965
## 854                       B                0.01886792                0.98113208
## 865                       B                0.03448276                0.96551724
## 867                       M                0.03448276                0.96551724
## 870                       M                1.00000000                0.00000000
## 876                       B                0.03448276                0.96551724
## 882                       B                0.09090909                0.90909091
## 886                       B                0.03448276                0.96551724
## 895                       B                0.03448276                0.96551724
## 896                       B                0.03448276                0.96551724
## 905                       M                0.15384615                0.84615385
## 906                       B                0.09523810                0.90476190
## 913                       M                0.85714286                0.14285714
## 917                       B                0.03448276                0.96551724
## 919                       B                0.09523810                0.90476190
## 922                       M                0.84000000                0.16000000
## 923                       M                0.82352941                0.17647059
## 925                       B                0.10526316                0.89473684
## 928                       B                0.09090909                0.90909091
## 932                       B                0.85714286                0.14285714
## 936                       M                0.85714286                0.14285714
## 941                       B                0.03448276                0.96551724
## 950                       B                0.03448276                0.96551724
## 953                       B                0.01886792                0.98113208
## 956                       B                0.09090909                0.90909091
## 967                       B                0.01886792                0.98113208
## 973                       B                0.03448276                0.96551724
## 974                       B                0.03448276                0.96551724
## 976                       B                0.03448276                0.96551724
## 980                       B                0.03448276                0.96551724
## 985                       B                0.85185185                0.14814815
## 987                       M                0.95614035                0.04385965
## 993                       B                0.85714286                0.14285714
## 1010                      B                0.01886792                0.98113208
## 1015                      B                0.82352941                0.17647059
## 1023                      B                0.03448276                0.96551724
## 1025                      B                0.20000000                0.80000000
## 1030                      M                0.95614035                0.04385965
## 1034                      B                0.03448276                0.96551724
## 1043                      B                0.04109589                0.95890411
## 1045                      B                0.03448276                0.96551724
## 1046                      B                0.61538462                0.38461538
## 1047                      B                0.03448276                0.96551724
## 1052                      B                0.03448276                0.96551724
## 1060                      B                0.61538462                0.38461538
## 1061                      B                0.03448276                0.96551724
## 1066                      B                0.92857143                0.07142857
## 1068                      M                0.58823529                0.41176471
## 1071                      M                0.95614035                0.04385965
## 1072                      B                0.03448276                0.96551724
## 1077                      B                0.25000000                0.75000000
## 1090                      B                0.84000000                0.16000000
## 1094                      B                0.09523810                0.90476190
## 1097                      B                0.03448276                0.96551724
## 1098                      B                0.01886792                0.98113208
## 1105                      M                0.85714286                0.14285714
## 1106                      M                0.82352941                0.17647059
## 1109                      B                0.82352941                0.17647059
## 1113                      B                0.04109589                0.95890411
## 1129                      B                0.20000000                0.80000000
## 1136                      M                0.66666667                0.33333333
## 1138                      B                0.04109589                0.95890411
##################################
# Reporting the independent evaluation results
# for the test set
##################################
BAL_CART_Test_ROC <- roc(response = BAL_CART_Test$BAL_CART_Test_Observed,
                         predictor = BAL_CART_Test$BAL_CART_Test_Predicted.M,
                         levels = rev(levels(BAL_CART_Test$BAL_CART_Test_Observed)))

(BAL_CART_Test_AUROC <- auc(BAL_CART_Test_ROC)[1])
## [1] 0.8843478

1.7.3 Base Learner Model Development using Support Vector Machine - Radial Basis Function Kernel (BAL_SVM_R)


Details.
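A brief note on the kernel before tuning: caret's 'svmRadial' method wraps kernlab's Gaussian Radial Basis kernel, which in kernlab's parameterization is

k(x, x') = exp(-sigma * ||x - x'||^2)

where sigma controls how localized the decision boundary is (larger sigma gives more flexible, more local boundaries) and the cost parameter C penalizes margin violations. These are the two hyperparameters referenced in the tuning output below.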

Code Chunk | Output
##################################
# Setting the conditions
# for hyperparameter tuning
##################################
# used caret's default tuning grid over a range of cost values
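As a sketch of what those defaults are (caret's built-in grid behavior for 'svmRadial'; the exact sigma estimate depends on the training data): when only tuneLength is supplied, caret typically holds sigma at a single value estimated internally via kernlab::sigest and varies the cost over powers of two, so tuneLength = 14 corresponds to

C = 2^((1:14) - 3) = 0.25, 0.50, 1.00, 2.00, ..., 2048.00

which matches the fourteen candidate cost values reported in the resampling results below.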

##################################
# Running the support vector machine model
# by setting the caret method to 'svmRadial'
##################################
set.seed(12345678)
BAL_SVM_R_Tune <- train(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
                        y = MA_Train$diagnosis,
                        method = "svmRadial",
                        preProc = c("center", "scale"),
                        tuneLength = 14,
                        metric = "ROC",
                        trControl = RKFold_Control)

##################################
# Reporting the cross-validation results
# for the train set
##################################
BAL_SVM_R_Tune
## Support Vector Machines with Radial Basis Function Kernel 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## Pre-processing: centered (6), scaled (6) 
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 729, 729, 730, 730, 730, 730, ... 
## Resampling results across tuning parameters:
## 
##   C        ROC        Sens       Spec     
##      0.25  0.8785695  0.7035294  0.9108375
##      0.50  0.8841244  0.7217647  0.9090831
##      1.00  0.8901422  0.7335294  0.9048848
##      2.00  0.8994429  0.7364706  0.9052326
##      4.00  0.9053698  0.7441176  0.9006834
##      8.00  0.9055011  0.7458824  0.8961373
##     16.00  0.9086274  0.7494118  0.9003295
##     32.00  0.9095067  0.7558824  0.9010191
##     64.00  0.9071082  0.7635294  0.9020809
##    128.00  0.9068724  0.7647059  0.9097574
##    256.00  0.9078494  0.7900000  0.9153623
##    512.00  0.9064750  0.8000000  0.9241190
##   1024.00  0.9055970  0.8076471  0.9262029
##   2048.00  0.9030325  0.8164706  0.9286621
## 
## Tuning parameter 'sigma' was held constant at a value of 0.2133227
## ROC was used to select the optimal model using the largest value.
## The final values used for the model were sigma = 0.2133227 and C = 32.
BAL_SVM_R_Tune$finalModel
## Support Vector Machine object of class "ksvm" 
## 
## SV type: C-svc  (classification) 
##  parameter : cost C = 32 
## 
## Gaussian Radial Basis kernel function. 
##  Hyperparameter : sigma =  0.213322651914923 
## 
## Number of Support Vectors : 356 
## 
## Objective Function Value : -6239.218 
## Training error : 0.08114 
## Probability model included.
BAL_SVM_R_Tune$results
##        sigma       C       ROC      Sens      Spec      ROCSD     SensSD
## 1  0.2133227    0.25 0.8785695 0.7035294 0.9108375 0.02404759 0.04721176
## 2  0.2133227    0.50 0.8841244 0.7217647 0.9090831 0.02300940 0.04949281
## 3  0.2133227    1.00 0.8901422 0.7335294 0.9048848 0.02180995 0.04479062
## 4  0.2133227    2.00 0.8994429 0.7364706 0.9052326 0.01913779 0.04531067
## 5  0.2133227    4.00 0.9053698 0.7441176 0.9006834 0.01706242 0.04950738
## 6  0.2133227    8.00 0.9055011 0.7458824 0.8961373 0.01610225 0.05328050
## 7  0.2133227   16.00 0.9086274 0.7494118 0.9003295 0.01630832 0.05259281
## 8  0.2133227   32.00 0.9095067 0.7558824 0.9010191 0.01839949 0.04950738
## 9  0.2133227   64.00 0.9071082 0.7635294 0.9020809 0.01952445 0.05128116
## 10 0.2133227  128.00 0.9068724 0.7647059 0.9097574 0.01936122 0.05469587
## 11 0.2133227  256.00 0.9078494 0.7900000 0.9153623 0.02066747 0.05259966
## 12 0.2133227  512.00 0.9064750 0.8000000 0.9241190 0.02410424 0.04746303
## 13 0.2133227 1024.00 0.9055970 0.8076471 0.9262029 0.02220728 0.05972351
## 14 0.2133227 2048.00 0.9030325 0.8164706 0.9286621 0.02363041 0.05729006
##        SpecSD
## 1  0.02975325
## 2  0.03050958
## 3  0.03145146
## 4  0.02573695
## 5  0.02473761
## 6  0.02506890
## 7  0.02219347
## 8  0.02905519
## 9  0.02835256
## 10 0.03106407
## 11 0.02789321
## 12 0.02662871
## 13 0.02696224
## 14 0.02368710
(BAL_SVM_R_Train_AUROC <- BAL_SVM_R_Tune$results[BAL_SVM_R_Tune$results$C==BAL_SVM_R_Tune$bestTune$C,
                                                       c("ROC")])
## [1] 0.9095067
##################################
# Identifying and plotting the
# best model predictors
##################################
# model does not support variable importance measurement

##################################
# Independently evaluating the model
# on the test set
##################################
BAL_SVM_R_Test <- data.frame(BAL_SVM_R_Test_Observed = MA_Test$diagnosis,
                             BAL_SVM_R_Test_Predicted = predict(BAL_SVM_R_Tune,
                                                                MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                                type = "prob"))

BAL_SVM_R_Test
##     BAL_SVM_R_Test_Observed BAL_SVM_R_Test_Predicted.M
## 1                         M               0.9044222829
## 2                         M               0.7290596096
## 3                         M               0.9233468355
## 4                         M               0.7404179815
## 5                         M               0.7945116889
## 6                         M               0.9170819657
## 7                         M               0.9156025332
## 8                         M               0.1669205011
## 9                         M               0.9758341031
## 10                        M               0.9647140381
## 11                        M               0.9932556308
## 12                        M               0.4508001411
## 13                        B               0.0003314662
## 14                        M               0.9021609771
## 15                        B               0.2047716701
## 16                        B               0.0707212058
## 17                        M               0.3236152237
## 18                        B               0.0187507336
## 19                        B               0.7446307535
## 20                        B               0.5427955060
## 21                        B               0.0068373114
## 22                        B               0.1947931307
## 23                        B               0.5253099013
## 24                        B               0.5044853061
## 25                        B               0.1970896749
## 26                        M               0.7689463221
## 27                        B               0.7914438541
## 28                        M               0.7290744455
## 29                        B               0.1631383158
## 30                        B               0.0761838734
## 31                        M               0.7290996044
## 32                        M               0.7349402279
## 33                        M               0.7290321252
## 34                        M               0.7290625416
## 35                        B               0.4316680111
## 36                        B               0.0306830496
## 37                        M               0.1523193270
## 38                        B               0.0087966314
## 39                        M               0.7251864555
## 40                        M               0.7290782595
## 41                        M               0.8598266814
## 42                        M               0.7290026567
## 43                        M               0.9938849286
## 44                        B               0.6063179288
## 45                        B               0.8035877278
## 46                        M               0.6242223973
## 47                        M               0.7290799020
## 48                        B               0.4563422587
## 49                        B               0.2047403876
## 50                        B               0.2047985340
## 51                        B               0.0265004852
## 52                        B               0.1745819827
## 53                        M               0.6781421929
## 54                        B               0.0098709363
## 55                        B               0.0114137185
## 56                        B               0.2264681105
## 57                        M               0.4824927787
## 58                        M               0.8670805402
## 59                        M               0.4355184055
## 60                        M               0.7289930039
## 61                        M               0.9591627888
## 62                        B               0.0052187035
## 63                        B               0.1841120773
## 64                        M               0.6889729612
## 65                        B               0.2138211654
## 66                        B               0.0824628982
## 67                        B               0.0318241383
## 68                        B               0.0083108275
## 69                        B               0.0018392697
## 70                        B               0.1504943078
## 71                        B               0.0405801266
## 72                        B               0.2869119087
## 73                        B               0.0462521616
## 74                        B               0.2216940163
## 75                        B               0.0674004043
## 76                        B               0.2047863777
## 77                        B               0.1316489487
## 78                        B               0.0221222591
## 79                        M               0.7824602486
## 80                        B               0.2048291380
## 81                        B               0.5441367870
## 82                        B               0.2048147283
## 83                        B               0.5165131506
## 84                        M               0.9132258558
## 85                        M               0.7291484662
## 86                        B               0.2048362023
## 87                        M               0.1806026856
## 88                        B               0.0218300552
## 89                        B               0.0270350012
## 90                        B               0.0320670836
## 91                        M               0.9589181012
## 92                        B               0.0169817243
## 93                        B               0.0265853072
## 94                        B               0.0865747861
## 95                        B               0.0084736509
## 96                        M               0.6498979581
## 97                        B               0.0013661124
## 98                        B               0.7197280965
## 99                        B               0.1097556022
## 100                       B               0.1616817543
## 101                       M               0.9315351853
## 102                       B               0.3177547602
## 103                       B               0.2047586547
## 104                       B               0.5262180182
## 105                       B               0.1875604282
## 106                       B               0.7716527296
## 107                       B               0.1445524662
## 108                       B               0.8269889251
## 109                       M               0.3266559858
## 110                       M               0.8787591167
## 111                       B               0.1809532465
## 112                       B               0.4755611353
## 113                       B               0.3888383226
## 114                       B               0.2843340190
## 115                       B               0.1473702752
## 116                       B               0.2047528091
## 117                       B               0.1067514334
## 118                       M               0.9479107742
## 119                       M               0.9773435443
## 120                       M               0.5754594849
## 121                       M               0.6015802796
## 122                       B               0.0649192935
## 123                       B               0.0213288903
## 124                       M               0.9742491322
## 125                       M               0.7290773094
## 126                       M               0.1669205011
## 127                       M               0.9932556308
## 128                       B               0.4614325728
## 129                       B               0.0275797282
## 130                       B               0.0522901184
## 131                       M               0.7290866363
## 132                       B               0.2047713327
## 133                       B               0.1584687810
## 134                       M               0.8507838033
## 135                       B               0.0068373114
## 136                       M               0.7290163932
## 137                       B               0.2047251065
## 138                       B               0.2047885799
## 139                       B               0.7914438541
## 140                       M               0.9674604705
## 141                       M               0.7289655089
## 142                       B               0.0814530866
## 143                       M               0.1674733888
## 144                       M               0.7633769432
## 145                       B               0.2047902442
## 146                       B               0.2047931519
## 147                       B               0.1856451887
## 148                       M               0.6585283243
## 149                       M               0.7291168244
## 150                       M               0.7704623651
## 151                       M               0.8547308285
## 152                       M               0.8598266814
## 153                       B               0.0993871425
## 154                       M               0.5397197215
## 155                       M               0.6242223973
## 156                       B               0.4563422587
## 157                       B               0.0265004852
## 158                       B               0.0059930669
## 159                       M               0.7890539791
## 160                       B               0.1823812336
## 161                       B               0.0763185137
## 162                       B               0.6154978701
## 163                       M               0.8133833089
## 164                       M               0.1518747402
## 165                       M               0.7995780848
## 166                       B               0.2138211654
## 167                       B               0.0007057461
## 168                       M               0.5019879468
## 169                       M               0.7290765956
## 170                       B               0.0151907106
## 171                       B               0.2047602499
## 172                       B               0.0018950016
## 173                       B               0.0297726297
## 174                       B               0.0346484595
## 175                       M               0.3196586472
## 176                       B               0.2047513452
## 177                       M               0.7719056636
## 178                       B               0.0245628151
## 179                       B               0.1397775263
## 180                       M               0.9219526232
## 181                       M               0.7290235658
## 182                       B               0.0405801266
## 183                       B               0.0881705707
## 184                       B               0.2869119087
## 185                       M               0.7691243509
## 186                       B               0.0016429278
## 187                       B               0.2047986697
## 188                       B               0.2910240994
## 189                       B               0.1316489487
## 190                       B               0.1988594824
## 191                       B               0.0054014539
## 192                       B               0.0026813409
## 193                       B               0.0354992607
## 194                       B               0.2047453695
## 195                       B               0.8763465750
## 196                       M               0.9189588302
## 197                       B               0.1960193136
## 198                       B               0.0928743590
## 199                       B               0.6866198710
## 200                       B               0.0218300552
## 201                       B               0.2047956979
## 202                       M               0.7291675790
## 203                       B               0.0329259941
## 204                       B               0.0133748316
## 205                       B               0.1984969871
## 206                       B               0.2047046925
## 207                       B               0.0555626842
## 208                       B               0.0041192726
## 209                       B               0.4214333492
## 210                       B               0.0239206631
## 211                       B               0.7197280965
## 212                       M               0.1521024957
## 213                       M               0.9315351853
## 214                       B               0.0344278876
## 215                       B               0.1387945330
## 216                       B               0.2047557364
## 217                       B               0.5262180182
## 218                       B               0.0012688405
## 219                       B               0.1231909083
## 220                       M               0.5289037277
## 221                       M               0.8787591167
## 222                       B               0.2048531346
## 223                       B               0.1809532465
## 224                       B               0.2048034409
## 225                       M               0.3484941721
## 226                       B               0.2048194280
##     BAL_SVM_R_Test_Predicted.B
## 1                  0.095577717
## 2                  0.270940390
## 3                  0.076653165
## 4                  0.259582018
## 5                  0.205488311
## 6                  0.082918034
## 7                  0.084397467
## 8                  0.833079499
## 9                  0.024165897
## 10                 0.035285962
## 11                 0.006744369
## 12                 0.549199859
## 13                 0.999668534
## 14                 0.097839023
## 15                 0.795228330
## 16                 0.929278794
## 17                 0.676384776
## 18                 0.981249266
## 19                 0.255369247
## 20                 0.457204494
## 21                 0.993162689
## 22                 0.805206869
## 23                 0.474690099
## 24                 0.495514694
## 25                 0.802910325
## 26                 0.231053678
## 27                 0.208556146
## 28                 0.270925554
## 29                 0.836861684
## 30                 0.923816127
## 31                 0.270900396
## 32                 0.265059772
## 33                 0.270967875
## 34                 0.270937458
## 35                 0.568331989
## 36                 0.969316950
## 37                 0.847680673
## 38                 0.991203369
## 39                 0.274813545
## 40                 0.270921741
## 41                 0.140173319
## 42                 0.270997343
## 43                 0.006115071
## 44                 0.393682071
## 45                 0.196412272
## 46                 0.375777603
## 47                 0.270920098
## 48                 0.543657741
## 49                 0.795259612
## 50                 0.795201466
## 51                 0.973499515
## 52                 0.825418017
## 53                 0.321857807
## 54                 0.990129064
## 55                 0.988586282
## 56                 0.773531890
## 57                 0.517507221
## 58                 0.132919460
## 59                 0.564481595
## 60                 0.271006996
## 61                 0.040837211
## 62                 0.994781297
## 63                 0.815887923
## 64                 0.311027039
## 65                 0.786178835
## 66                 0.917537102
## 67                 0.968175862
## 68                 0.991689173
## 69                 0.998160730
## 70                 0.849505692
## 71                 0.959419873
## 72                 0.713088091
## 73                 0.953747838
## 74                 0.778305984
## 75                 0.932599596
## 76                 0.795213622
## 77                 0.868351051
## 78                 0.977877741
## 79                 0.217539751
## 80                 0.795170862
## 81                 0.455863213
## 82                 0.795185272
## 83                 0.483486849
## 84                 0.086774144
## 85                 0.270851534
## 86                 0.795163798
## 87                 0.819397314
## 88                 0.978169945
## 89                 0.972964999
## 90                 0.967932916
## 91                 0.041081899
## 92                 0.983018276
## 93                 0.973414693
## 94                 0.913425214
## 95                 0.991526349
## 96                 0.350102042
## 97                 0.998633888
## 98                 0.280271903
## 99                 0.890244398
## 100                0.838318246
## 101                0.068464815
## 102                0.682245240
## 103                0.795241345
## 104                0.473781982
## 105                0.812439572
## 106                0.228347270
## 107                0.855447534
## 108                0.173011075
## 109                0.673344014
## 110                0.121240883
## 111                0.819046753
## 112                0.524438865
## 113                0.611161677
## 114                0.715665981
## 115                0.852629725
## 116                0.795247191
## 117                0.893248567
## 118                0.052089226
## 119                0.022656456
## 120                0.424540515
## 121                0.398419720
## 122                0.935080706
## 123                0.978671110
## 124                0.025750868
## 125                0.270922691
## 126                0.833079499
## 127                0.006744369
## 128                0.538567427
## 129                0.972420272
## 130                0.947709882
## 131                0.270913364
## 132                0.795228667
## 133                0.841531219
## 134                0.149216197
## 135                0.993162689
## 136                0.270983607
## 137                0.795274894
## 138                0.795211420
## 139                0.208556146
## 140                0.032539529
## 141                0.271034491
## 142                0.918546913
## 143                0.832526611
## 144                0.236623057
## 145                0.795209756
## 146                0.795206848
## 147                0.814354811
## 148                0.341471676
## 149                0.270883176
## 150                0.229537635
## 151                0.145269171
## 152                0.140173319
## 153                0.900612857
## 154                0.460280278
## 155                0.375777603
## 156                0.543657741
## 157                0.973499515
## 158                0.994006933
## 159                0.210946021
## 160                0.817618766
## 161                0.923681486
## 162                0.384502130
## 163                0.186616691
## 164                0.848125260
## 165                0.200421915
## 166                0.786178835
## 167                0.999294254
## 168                0.498012053
## 169                0.270923404
## 170                0.984809289
## 171                0.795239750
## 172                0.998104998
## 173                0.970227370
## 174                0.965351540
## 175                0.680341353
## 176                0.795248655
## 177                0.228094336
## 178                0.975437185
## 179                0.860222474
## 180                0.078047377
## 181                0.270976434
## 182                0.959419873
## 183                0.911829429
## 184                0.713088091
## 185                0.230875649
## 186                0.998357072
## 187                0.795201330
## 188                0.708975901
## 189                0.868351051
## 190                0.801140518
## 191                0.994598546
## 192                0.997318659
## 193                0.964500739
## 194                0.795254631
## 195                0.123653425
## 196                0.081041170
## 197                0.803980686
## 198                0.907125641
## 199                0.313380129
## 200                0.978169945
## 201                0.795204302
## 202                0.270832421
## 203                0.967074006
## 204                0.986625168
## 205                0.801503013
## 206                0.795295308
## 207                0.944437316
## 208                0.995880727
## 209                0.578566651
## 210                0.976079337
## 211                0.280271903
## 212                0.847897504
## 213                0.068464815
## 214                0.965572112
## 215                0.861205467
## 216                0.795244264
## 217                0.473781982
## 218                0.998731159
## 219                0.876809092
## 220                0.471096272
## 221                0.121240883
## 222                0.795146865
## 223                0.819046753
## 224                0.795196559
## 225                0.651505828
## 226                0.795180572
##################################
# Reporting the independent evaluation results
# for the test set
##################################
BAL_SVM_R_Test_ROC <- roc(response = BAL_SVM_R_Test$BAL_SVM_R_Test_Observed,
                          predictor = BAL_SVM_R_Test$BAL_SVM_R_Test_Predicted.M,
                          levels = rev(levels(BAL_SVM_R_Test$BAL_SVM_R_Test_Observed)))

(BAL_SVM_R_Test_AUROC <- auc(BAL_SVM_R_Test_ROC)[1])
## [1] 0.9159121
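Beyond the AUROC, a confusion matrix can summarize the test-set performance of the SVM base learner in terms of raw class agreement. A minimal sketch, assuming the `BAL_SVM_R_Test` data frame from the chunk above is available and using an illustrative 0.50 probability cutoff (not a tuned threshold):

```r
##################################
# Sketching a confusion matrix
# at an assumed 0.50 probability cutoff
# (assumes BAL_SVM_R_Test exists)
##################################
library(caret)

BAL_SVM_R_Test_Class <- factor(ifelse(BAL_SVM_R_Test$BAL_SVM_R_Test_Predicted.M >= 0.50,
                                      "M",
                                      "B"),
                               levels = levels(BAL_SVM_R_Test$BAL_SVM_R_Test_Observed))

confusionMatrix(data = BAL_SVM_R_Test_Class,
                reference = BAL_SVM_R_Test$BAL_SVM_R_Test_Observed,
                positive = "M")
```

The 0.50 cutoff is a convention, not a recommendation; a threshold chosen from the ROC curve (for example via `pROC::coords`) may be preferable when sensitivity and specificity carry unequal costs.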

1.7.4 Base Learner Model Development using K-Nearest Neighbors (BAL_KNN)


Details.

Code Chunk | Output
##################################
# Setting the conditions
# for hyperparameter tuning
##################################
KNN_Grid <- data.frame(k = 1:15)

##################################
# Running the k-nearest neighbors model
# by setting the caret method to 'knn'
##################################
set.seed(12345678)
BAL_KNN_Tune <- train(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
                      y = MA_Train$diagnosis,
                      method = "knn",
                      preProc = c("center", "scale"),
                      tuneGrid = KNN_Grid,
                      metric = "ROC",
                      trControl = RKFold_Control)

##################################
# Reporting the cross-validation results
# for the train set
##################################
BAL_KNN_Tune
## k-Nearest Neighbors 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## Pre-processing: centered (6), scaled (6) 
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 729, 729, 730, 730, 730, 730, ... 
## Resampling results across tuning parameters:
## 
##   k   ROC        Sens       Spec     
##    1  0.8999215  0.8841176  0.9157254
##    2  0.8810826  0.7476471  0.8335652
##    3  0.8881622  0.7223529  0.8440793
##    4  0.8843389  0.7264706  0.8751670
##    5  0.8801288  0.7294118  0.8877712
##    6  0.8822927  0.7111765  0.8800671
##    7  0.8826117  0.7105882  0.8828680
##    8  0.8824661  0.7176471  0.8814737
##    9  0.8822659  0.7141176  0.8811106
##   10  0.8835703  0.6958824  0.8779588
##   11  0.8866311  0.7100000  0.8793593
##   12  0.8871688  0.7094118  0.8790175
##   13  0.8888748  0.7094118  0.8776171
##   14  0.8887794  0.7105882  0.8797437
##   15  0.8901884  0.7082353  0.8793715
## 
## ROC was used to select the optimal model using the largest value.
## The final value used for the model was k = 1.
BAL_KNN_Tune$finalModel
## 1-nearest neighbor model
## Training set outcome distribution:
## 
##   M   B 
## 340 572
BAL_KNN_Tune$results
##     k       ROC      Sens      Spec      ROCSD     SensSD     SpecSD
## 1   1 0.8999215 0.8841176 0.9157254 0.02896451 0.05993436 0.02526390
## 2   2 0.8810826 0.7476471 0.8335652 0.02389197 0.06191043 0.03572945
## 3   3 0.8881622 0.7223529 0.8440793 0.02482202 0.04827623 0.02869459
## 4   4 0.8843389 0.7264706 0.8751670 0.02288013 0.05453087 0.02753525
## 5   5 0.8801288 0.7294118 0.8877712 0.02422588 0.05076548 0.03244839
## 6   6 0.8822927 0.7111765 0.8800671 0.02770719 0.05434548 0.03295881
## 7   7 0.8826117 0.7105882 0.8828680 0.02576758 0.05497850 0.03623026
## 8   8 0.8824661 0.7176471 0.8814737 0.02787252 0.04611492 0.03152335
## 9   9 0.8822659 0.7141176 0.8811106 0.02617793 0.04494326 0.03184781
## 10 10 0.8835703 0.6958824 0.8779588 0.02477549 0.04131629 0.03514350
## 11 11 0.8866311 0.7100000 0.8793593 0.02563095 0.04213691 0.03419137
## 12 12 0.8871688 0.7094118 0.8790175 0.02675550 0.03707710 0.03037541
## 13 13 0.8888748 0.7094118 0.8776171 0.02545936 0.03756003 0.03167993
## 14 14 0.8887794 0.7105882 0.8797437 0.02563345 0.03976031 0.03542890
## 15 15 0.8901884 0.7082353 0.8793715 0.02412346 0.04259632 0.03291041
(BAL_KNN_Train_AUROC <- BAL_KNN_Tune$results[BAL_KNN_Tune$results$k==BAL_KNN_Tune$bestTune$k,
                                                   c("ROC")])
## [1] 0.8999215
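The cross-validated ROC profile across the candidate `k` values can also be inspected visually rather than read off the table. A minimal sketch, assuming the `BAL_KNN_Tune` object from the chunk above is available:

```r
##################################
# Visualizing the cross-validated
# ROC profile across k values
# (assumes BAL_KNN_Tune exists)
##################################
plot(BAL_KNN_Tune, metric = "ROC")
```

Note that because the selected model uses k = 1, each test observation takes the class of its single nearest neighbor, so the predicted class probabilities are necessarily exact 0s and 1s, as seen in the test-set output below.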
##################################
# Identifying and plotting the
# best model predictors
##################################
# model does not support variable importance measurement
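Although the k-nearest neighbors model itself provides no model-specific importance measure, caret offers a model-free alternative that scores each predictor by the area under its univariate ROC curve against the response. A minimal sketch, assuming the `MA_Train` data frame is available:

```r
##################################
# Sketching model-free, ROC-based
# variable importance as an alternative
# (assumes MA_Train exists)
##################################
library(caret)

filterVarImp(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
             y = MA_Train$diagnosis)
```

This filter-based ranking is independent of the fitted KNN model, so it describes the predictors' marginal discriminative ability rather than their contribution to the 1-nearest-neighbor decision.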

##################################
# Independently evaluating the model
# on the test set
##################################
BAL_KNN_Test <- data.frame(BAL_KNN_Test_Observed = MA_Test$diagnosis,
                           BAL_KNN_Test_Predicted = predict(BAL_KNN_Tune,
                                                            MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                            type = "prob"))

BAL_KNN_Test
##     BAL_KNN_Test_Observed BAL_KNN_Test_Predicted.M BAL_KNN_Test_Predicted.B
## 1                       M                        1                        0
## 2                       M                        1                        0
## 3                       M                        1                        0
## 4                       M                        1                        0
## 5                       M                        1                        0
## 6                       M                        1                        0
## 7                       M                        1                        0
## 8                       M                        1                        0
## 9                       M                        1                        0
## 10                      M                        1                        0
## 11                      M                        1                        0
## 12                      M                        1                        0
## 13                      B                        0                        1
## 14                      M                        1                        0
## 15                      B                        0                        1
## 16                      B                        0                        1
## 17                      M                        1                        0
## 18                      B                        0                        1
## 19                      B                        0                        1
## 20                      B                        0                        1
## 21                      B                        0                        1
## 22                      B                        0                        1
## 23                      B                        0                        1
## 24                      B                        0                        1
## 25                      B                        0                        1
## 26                      M                        1                        0
## 27                      B                        1                        0
## 28                      M                        1                        0
## 29                      B                        0                        1
## 30                      B                        0                        1
## 31                      M                        1                        0
## 32                      M                        1                        0
## 33                      M                        1                        0
## 34                      M                        1                        0
## 35                      B                        0                        1
## 36                      B                        0                        1
## 37                      M                        1                        0
## 38                      B                        0                        1
## 39                      M                        1                        0
## 40                      M                        1                        0
## 41                      M                        1                        0
## 42                      M                        1                        0
## 43                      M                        1                        0
## 44                      B                        0                        1
## 45                      B                        0                        1
## 46                      M                        1                        0
## 47                      M                        1                        0
## 48                      B                        1                        0
## 49                      B                        0                        1
## 50                      B                        0                        1
## 51                      B                        0                        1
## 52                      B                        0                        1
## 53                      M                        1                        0
## 54                      B                        0                        1
## 55                      B                        0                        1
## 56                      B                        0                        1
## 57                      M                        1                        0
## 58                      M                        1                        0
## 59                      M                        1                        0
## 60                      M                        1                        0
## 61                      M                        1                        0
## 62                      B                        0                        1
## 63                      B                        0                        1
## 64                      M                        1                        0
## 65                      B                        0                        1
## 66                      B                        0                        1
## 67                      B                        0                        1
## 68                      B                        0                        1
## 69                      B                        0                        1
## 70                      B                        0                        1
## 71                      B                        0                        1
## 72                      B                        0                        1
## 73                      B                        0                        1
## 74                      B                        0                        1
## 75                      B                        0                        1
## 76                      B                        0                        1
## 77                      B                        0                        1
## 78                      B                        0                        1
## 79                      M                        1                        0
## 80                      B                        0                        1
## 81                      B                        0                        1
## 82                      B                        0                        1
## 83                      B                        0                        1
## 84                      M                        1                        0
## 85                      M                        1                        0
## 86                      B                        0                        1
## 87                      M                        1                        0
## 88                      B                        0                        1
## 89                      B                        0                        1
## 90                      B                        0                        1
## 91                      M                        1                        0
## 92                      B                        0                        1
## 93                      B                        0                        1
## 94                      B                        0                        1
## 95                      B                        0                        1
## 96                      M                        1                        0
## 97                      B                        0                        1
## 98                      B                        1                        0
## 99                      B                        0                        1
## 100                     B                        0                        1
## 101                     M                        1                        0
## 102                     B                        0                        1
## 103                     B                        0                        1
## 104                     B                        1                        0
## 105                     B                        0                        1
## 106                     B                        0                        1
## 107                     B                        0                        1
## 108                     B                        0                        1
## 109                     M                        1                        0
## 110                     M                        1                        0
## 111                     B                        0                        1
## 112                     B                        0                        1
## 113                     B                        0                        1
## 114                     B                        0                        1
## 115                     B                        0                        1
## 116                     B                        0                        1
## 117                     B                        0                        1
## 118                     M                        1                        0
## 119                     M                        1                        0
## 120                     M                        1                        0
## 121                     M                        1                        0
## 122                     B                        0                        1
## 123                     B                        0                        1
## 124                     M                        1                        0
## 125                     M                        1                        0
## 126                     M                        1                        0
## 127                     M                        1                        0
## 128                     B                        0                        1
## 129                     B                        0                        1
## 130                     B                        0                        1
## 131                     M                        1                        0
## 132                     B                        0                        1
## 133                     B                        0                        1
## 134                     M                        1                        0
## 135                     B                        0                        1
## 136                     M                        1                        0
## 137                     B                        0                        1
## 138                     B                        0                        1
## 139                     B                        1                        0
## 140                     M                        1                        0
## 141                     M                        1                        0
## 142                     B                        0                        1
## 143                     M                        1                        0
## 144                     M                        1                        0
## 145                     B                        0                        1
## 146                     B                        0                        1
## 147                     B                        0                        1
## 148                     M                        1                        0
## 149                     M                        1                        0
## 150                     M                        1                        0
## 151                     M                        1                        0
## 152                     M                        1                        0
## 153                     B                        0                        1
## 154                     M                        1                        0
## 155                     M                        1                        0
## 156                     B                        1                        0
## 157                     B                        0                        1
## 158                     B                        0                        1
## 159                     M                        1                        0
## 160                     B                        0                        1
## 161                     B                        0                        1
## 162                     B                        0                        1
## 163                     M                        1                        0
## 164                     M                        1                        0
## 165                     M                        1                        0
## 166                     B                        0                        1
## 167                     B                        0                        1
## 168                     M                        1                        0
## 169                     M                        1                        0
## 170                     B                        0                        1
## 171                     B                        0                        1
## 172                     B                        0                        1
## 173                     B                        0                        1
## 174                     B                        0                        1
## 175                     M                        1                        0
## 176                     B                        0                        1
## 177                     M                        1                        0
## 178                     B                        0                        1
## 179                     B                        0                        1
## 180                     M                        1                        0
## 181                     M                        1                        0
## 182                     B                        0                        1
## 183                     B                        0                        1
## 184                     B                        0                        1
## 185                     M                        1                        0
## 186                     B                        0                        1
## 187                     B                        0                        1
## 188                     B                        0                        1
## 189                     B                        0                        1
## 190                     B                        0                        1
## 191                     B                        0                        1
## 192                     B                        0                        1
## 193                     B                        0                        1
## 194                     B                        0                        1
## 195                     B                        0                        1
## 196                     M                        1                        0
## 197                     B                        0                        1
## 198                     B                        0                        1
## 199                     B                        0                        1
## 200                     B                        0                        1
## 201                     B                        0                        1
## 202                     M                        1                        0
## 203                     B                        0                        1
## 204                     B                        0                        1
## 205                     B                        0                        1
## 206                     B                        0                        1
## 207                     B                        0                        1
## 208                     B                        0                        1
## 209                     B                        0                        1
## 210                     B                        0                        1
## 211                     B                        1                        0
## 212                     M                        1                        0
## 213                     M                        1                        0
## 214                     B                        0                        1
## 215                     B                        0                        1
## 216                     B                        0                        1
## 217                     B                        1                        0
## 218                     B                        0                        1
## 219                     B                        0                        1
## 220                     M                        1                        0
## 221                     M                        1                        0
## 222                     B                        0                        1
## 223                     B                        0                        1
## 224                     B                        0                        1
## 225                     M                        1                        0
## 226                     B                        0                        1
##################################
# Reporting the independent evaluation results
# for the test set
##################################
BAL_KNN_Test_ROC <- roc(response = BAL_KNN_Test$BAL_KNN_Test_Observed,
                        predictor = BAL_KNN_Test$BAL_KNN_Test_Predicted.M,
                        levels = rev(levels(BAL_KNN_Test$BAL_KNN_Test_Observed)))

(BAL_KNN_Test_AUROC <- auc(BAL_KNN_Test_ROC)[1])
## [1] 0.971831
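The AUROC value reported above can be cross-checked against the rank-based (Wilcoxon) formulation of the AUC, which is what `pROC::auc` computes under the hood. A minimal base-R sketch using hypothetical toy scores (not the test-set predictions above):

```r
##################################
# Hypothetical toy example: AUC via the Wilcoxon rank-sum identity
# labels: 1 = positive class (M), 0 = negative class (B)
##################################
labels <- c(1, 1, 0, 0, 1, 0)
scores <- c(0.9, 0.8, 0.3, 0.4, 0.7, 0.2)  # predicted P(M)

auc_rank <- function(labels, scores) {
  r <- rank(scores)                  # ranks of all predicted scores
  n_pos <- sum(labels == 1)
  n_neg <- sum(labels == 0)
  # Sum of positive-class ranks minus its minimum possible value,
  # normalized by the number of positive/negative pairs
  (sum(r[labels == 1]) - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
}

auc_rank(labels, scores)
## [1] 1
```

Here the toy scores separate the classes perfectly, so the AUC is 1; ties in the scores are handled by `rank`'s average-rank default, matching the usual AUC convention.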

1.7.5 Base Learner Model Development using Naive Bayes (BAL_NB)


Details.
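With `usekernel = FALSE`, as tuned below, Naive Bayes models each predictor with a class-conditional Gaussian (the per-class mean/sd pairs reported in `$tables`) and combines the resulting densities with the class priors under a conditional-independence assumption. A minimal base-R sketch of that scoring rule, using hypothetical parameter values rather than the fitted model:

```r
##################################
# Minimal Gaussian Naive Bayes scoring sketch
# (hypothetical priors, means and sds - not the fitted values)
# posterior is proportional to prior * product of dnorm densities
##################################
priors <- c(M = 0.4, B = 0.6)
# class-conditional mean and sd per predictor (rows: classes)
mu  <- rbind(M = c(x1 = 3.0, x2 = -2.3), B = c(x1 = 2.8, x2 = -2.4))
sdv <- rbind(M = c(x1 = 0.2, x2 = 0.12), B = c(x1 = 0.2, x2 = 0.14))

nb_posterior <- function(x, priors, mu, sdv) {
  # unnormalized log-posterior for each class
  logpost <- sapply(rownames(mu), function(k) {
    log(priors[k]) + sum(dnorm(x, mean = mu[k, ], sd = sdv[k, ], log = TRUE))
  })
  p <- exp(logpost - max(logpost))   # stabilize before normalizing
  p / sum(p)
}

nb_posterior(c(x1 = 3.05, x2 = -2.28), priors, mu, sdv)
```

Working in log space before exponentiating avoids underflow when many predictor densities are multiplied, which matters once all 30 (or the selected subset of) predictors enter the product.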

Code Chunk | Output
##################################
# Setting the conditions
# for hyperparameter tuning
##################################
NB_Grid = data.frame(usekernel = c(TRUE, FALSE), 
                     fL = 2, 
                     # Note: adjust is a numeric kernel bandwidth multiplier;
                     # FALSE coerces to 0, so the usekernel = TRUE candidate
                     # fails to resample (the NaN metrics in the output below)
                     adjust = FALSE)

##################################
# Running the Naive Bayes model
# by setting the caret method to 'nb'
##################################
set.seed(12345678)
BAL_NB_Tune <- train(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
                     y = MA_Train$diagnosis,
                     method = "nb",
                     tuneGrid = NB_Grid,
                     metric = "ROC",
                     trControl = RKFold_Control)

##################################
# Reporting the cross-validation results
# for the train set
##################################
BAL_NB_Tune
## Naive Bayes 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 729, 729, 730, 730, 730, 730, ... 
## Resampling results across tuning parameters:
## 
##   usekernel  ROC        Sens       Spec     
##   FALSE      0.8864212  0.7576471  0.8643356
##    TRUE            NaN        NaN        NaN
## 
## Tuning parameter 'fL' was held constant at a value of 2
## Tuning
##  parameter 'adjust' was held constant at a value of FALSE
## ROC was used to select the optimal model using the largest value.
## The final values used for the model were fL = 2, usekernel = FALSE and adjust
##  = FALSE.
BAL_NB_Tune$finalModel
## $apriori
## grouping
##        M        B 
## 0.372807 0.627193 
## 
## $tables
## $tables$texture_mean
##       [,1]      [,2]
## M 3.053171 0.1768317
## B 2.861218 0.2118987
## 
## $tables$smoothness_mean
##        [,1]      [,2]
## M -2.283184 0.1220104
## B -2.392952 0.1412754
## 
## $tables$compactness_se
##        [,1]      [,2]
## M -3.579629 0.5293692
## B -4.063368 0.6656621
## 
## $tables$texture_worst
##       [,1]      [,2]
## M 4.790170 0.3648083
## B 4.360104 0.4212065
## 
## $tables$smoothness_worst
##        [,1]       [,2]
## M -1.471766 0.08336974
## B -1.553719 0.08524520
## 
## $tables$symmetry_worst
##        [,1]      [,2]
## M -1.588074 0.4012040
## B -1.881098 0.3039037
## 
## 
## $levels
## [1] "M" "B"
## 
## $call
## NaiveBayes.default(x = x, grouping = y, usekernel = FALSE, fL = param$fL)
## 
## $x
##       texture_mean smoothness_mean compactness_se texture_worst
## X1        2.339881       -2.133687      -3.015119      3.845649
## X2        2.877512       -2.468168      -4.336671      4.393994
## X3        3.056357       -2.210918      -3.217377      4.558289
## X4        3.014554       -1.948413      -2.595883      4.629842
## X5        2.663053       -2.299590      -3.704602      3.777223
## X6        2.753661       -2.057289      -3.397703      4.421124
## X7        2.994732       -2.357781      -4.281638      4.712710
## X9        3.082827       -2.061209      -3.351836      4.919334
## X10       3.179719       -2.131999      -2.628731      5.491708
## X11       3.145875       -2.500305      -4.681080      5.114832
## X12       2.884242       -2.332014      -3.203741      4.685875
## X14       3.175968       -2.476819      -3.465416      4.712710
## X15       3.118392       -2.179483      -2.824135      5.000625
## X17       3.002211       -2.315974      -4.455028      4.928999
## X18       3.029167       -2.145581      -3.688480      4.967287
## X20       2.664447       -2.324933      -4.226734      4.034440
## X21       2.754297       -2.230264      -3.964369      4.146994
## X22       2.520917       -2.278869      -4.246098      3.668189
## X23       2.657458       -2.232127      -2.932194      4.017490
## X25       3.062456       -2.188364      -3.972835      4.972347
## X26       2.797281       -2.131999      -3.270432      4.226835
## X27       3.069447       -2.249993      -3.488391      5.074506
## X28       3.008155       -2.360214      -3.603803      4.684455
## X29       3.229618       -2.223774      -3.487736      5.278432
## X30       2.711378       -2.318003      -3.495618      4.058702
## X32       2.928524       -2.199126      -3.377286      4.744803
## X33       3.177220       -2.122767      -3.479591      5.005619
## X34       3.276012       -2.364354      -3.405808      4.930285
## X35       2.883683       -2.263364      -3.551555      4.684455
## X36       3.072230       -2.342366      -3.689280      4.806397
## X38       2.913437       -2.409836      -5.318724      4.345339
## X39       3.226844       -2.365844      -4.515329      4.533450
## X40       3.035914       -2.286712      -3.799141      4.594701
## X41       3.071767       -2.505681      -4.508043      4.888151
## X43       3.211247       -2.398986      -2.296603      5.072078
## X45       3.082369       -2.331602      -4.280192      4.864503
## X47       2.823757       -2.453408      -4.106822      4.274627
## X49       2.683074       -2.272056      -4.249596      4.165667
## X50       3.104587       -2.435888      -4.281638      4.988725
## X51       3.072693       -2.449115      -4.629668      4.572474
## X52       2.793616       -2.565900      -4.442201      4.376271
## X53       2.903617       -2.493625      -4.781907      4.220791
## X55       3.091951       -2.401743      -4.575611      4.980549
## X56       2.931194       -2.351355      -4.741907      4.317312
## X57       2.921547       -2.250942      -3.769656      4.746189
## X58       3.072230       -2.174192      -3.556098      4.917397
## X59       2.960623       -2.518257      -4.756807      4.298995
## X60       2.467252       -2.327698      -4.551629      3.639212
## X62       3.043570       -2.085057      -3.453965      4.668773
## X63       3.097837       -2.254748      -2.651292      4.839292
## X64       2.629007       -2.561226      -3.234497      4.031624
## X66       3.175551       -2.143873      -3.767923      5.085404
## X67       3.044999       -2.259526      -4.042701      4.972347
## X68       2.946542       -2.508503      -4.686814      4.428254
## X73       3.199489       -2.233992      -2.879551      5.111247
## X74       2.759377       -2.295609      -3.880040      4.179793
## X75       2.804572       -2.389015      -4.006883      4.377888
## X76       2.978077       -2.389451      -3.815350      4.484527
## X77       2.392426       -2.047168      -3.561718      3.284809
## X78       2.781920       -2.239610      -2.840611      4.001364
## X79       3.176803       -2.051048      -2.683114      4.982438
## X80       2.890372       -2.309207      -4.097750      4.504524
## X81       3.043093       -2.205458      -4.071019      5.009980
## X83       3.215269       -2.241490      -2.865933      5.099260
## X84       3.269189       -2.107841      -2.805112      5.044600
## X85       2.750471       -2.330676      -4.010739      4.510643
## X86       2.918851       -2.315265      -4.105001      4.714115
## X87       3.066191       -2.359791      -3.512241      4.821893
## X88       3.202340       -2.404729      -3.994318      4.898589
## X90       2.723924       -2.178599      -3.120842      3.936655
## X91       3.178887       -2.410839      -4.007433      4.812472
## X92       3.125005       -2.385967      -3.711534      4.581390
## X94       2.906901       -2.280824      -4.205723      4.588794
## X95       2.987196       -2.264326      -3.292792      4.458901
## X96       3.136798       -2.399316      -3.357563      4.974243
## X97       2.881443       -2.258568      -4.440504      4.185067
## X99       2.552565       -2.409836      -4.319991      3.828226
## X100      2.984166       -2.327698      -3.542185      4.927712
## X101      3.218076       -2.355142      -4.207737      5.196499
## X102      2.597491       -2.145581      -4.524512      4.060557
## X105      2.959587       -2.303686      -3.808114      4.385955
## X106      2.744704       -1.967542      -3.536330      4.311499
## X108      2.919931       -2.467814      -4.559241      4.700742
## X110      3.056827       -2.435088      -4.162409      4.815168
## X111      2.832625       -2.266253      -3.529485      4.232863
## X112      3.033028       -2.309308      -3.208431      4.553792
## X113      2.978077       -2.546314      -2.597493      4.419537
## X114      3.005187       -2.187472      -3.284215      4.340417
## X115      2.761907       -2.162823      -3.810821      4.067964
## X117      2.757475       -2.357886      -2.694147      3.818947
## X118      2.813611       -2.152442      -3.663992      4.692258
## X119      3.131573       -2.158485      -3.224894      4.904441
## X120      2.996232       -2.476700      -4.776908      4.724620
## X121      2.381396       -2.367337      -4.180556      3.702239
## X122      2.840247       -2.249993      -3.862757      4.510643
## X124      2.387845       -2.206366      -4.361440      3.703328
## X126      2.845491       -2.432124      -4.691927      4.407598
## X127      3.206398       -2.379682      -4.444753      5.217803
## X128      2.939691       -2.498965      -3.600502      4.573218
## X131      2.587012       -2.238672      -3.706636      3.894116
## X132      2.969388       -2.214574      -4.210429      4.593226
## X133      3.069912       -2.294617      -3.936316      4.979920
## X134      2.634045       -2.357886      -4.189755      4.033502
## X135      3.086943       -2.361274      -4.253106      4.961581
## X137      2.813611       -2.252843      -4.283087      4.554542
## X138      2.733718       -2.339353      -4.185802      4.279690
## X140      2.594508       -2.150723      -3.352979      3.680332
## X141      2.482404       -2.380547      -5.175038      3.488165
## X142      2.893146       -2.330882      -3.943514      4.538741
## X143      2.851284       -2.214574      -4.054163      4.648665
## X144      2.767576       -2.444494      -4.158563      4.262772
## X145      2.706048       -2.551944      -4.027995      4.167437
## X146      2.684440       -2.161086      -3.048922      3.760309
## X148      2.932260       -2.508626      -3.020640      4.553792
## X151      3.033991       -2.175952      -4.472389      4.449513
## X152      3.030134       -2.363929      -2.915813      4.853256
## X153      2.730464       -2.233059      -2.344866      4.055916
## X154      2.571084       -2.327493      -4.712199      3.737909
## X155      2.730464       -2.366164      -3.903559      4.147887
## X156      2.887033       -2.447149      -4.159203      4.534963
## X157      3.032064       -2.193731      -3.004975      4.526631
## X158      2.968361       -2.597628      -3.626468      4.741335
## X159      2.544747       -2.373974      -4.626496      3.953251
## X160      2.561868       -2.588269      -5.093908      3.932732
## X161      3.004692       -2.217325      -3.727620      4.608673
## X163      2.898671       -2.189256      -3.782311      4.621834
## X164      3.100993       -2.290657      -3.449863      4.783310
## X165      3.092859       -2.472306      -3.671433      4.751724
## X166      2.983660       -2.474442      -4.830441      4.579906
## X168      2.933857       -2.423059      -3.700952      4.615263
## X169      3.205993       -2.254748      -3.314836      5.020540
## X170      2.830268       -2.317191      -4.290359      4.360856
## X171      2.516890       -2.274970      -4.439656      3.665973
## X172      2.977059       -2.402626      -4.544075      4.863182
## X173      2.475698       -2.073857      -3.763172      3.815845
## X174      2.688528       -2.296603      -3.853283      3.792962
## X175      2.718001       -2.431328      -4.588313      4.028805
## X176      2.670694       -2.392729      -4.827439      3.815845
## X177      2.893700       -2.333147      -2.429510      4.471360
## X178      3.001217       -2.319630      -3.195648      4.767568
## X179      3.100993       -2.772429      -6.095937      4.806397
## X180      2.569554       -2.437374      -5.057098      3.721768
## X181      3.085116       -2.212744      -3.674188      5.052569
## X182      3.279783       -2.170680      -3.045133      5.090835
## X183      3.011113       -2.343720      -3.788479      5.050733
## X184      2.702703       -2.401411      -3.289298      3.883102
## X186      2.715357       -2.378710      -4.422849      4.207786
## X187      2.922086       -2.454804      -4.690619      4.619646
## X188      2.844328       -2.325444      -4.743973      4.225973
## X189      2.855895       -2.295609      -4.735735      4.628388
## X190      2.766319       -2.515778      -3.811273      4.065190
## X192      3.063858       -2.436231      -4.022955      4.401206
## X193      2.902520       -2.666429      -5.000289      4.177151
## X195      3.144583       -2.259526      -2.971625      4.721123
## X196      2.793004       -2.533131      -4.143325      4.278004
## X198      3.083743       -2.607617      -2.938218      4.495315
## X199      3.113071       -2.462402      -3.297378      5.003747
## X202      2.961141       -2.411508      -3.661653      4.581390
## X203      3.283539       -2.170680      -2.971820      5.042143
## X204      3.167583       -2.022683      -3.471191      5.551376
## X205      2.923162       -2.306091      -3.957544      4.490698
## X206      2.814210       -2.421819      -3.953366      4.124564
## X207      2.848971       -2.217325      -4.382827      4.378696
## X208      3.008648       -2.433605      -4.197707      4.522074
## X210      2.558002       -2.503234      -4.393290      3.696783
## X212      2.941276       -2.422383      -4.058784      4.517508
## X214      3.241029       -2.296603      -2.458654      4.741335
## X215      3.170106       -2.357781      -3.294138      5.172099
## X216      2.829087       -2.276917      -3.370280      4.660893
## X218      2.861057       -2.519001      -3.467337      4.477566
## X219      3.070840       -2.366271      -3.451754      4.780580
## X220      3.480317       -2.474560      -3.632877      5.725074
## X224      3.008155       -2.277892      -3.740594      4.890764
## X225      2.834389       -2.471596      -4.506230      4.409194
## X226      2.600465       -2.312030      -4.248895      3.801311
## X227      2.738256       -2.250942      -4.820718      4.084542
## X229      3.176803       -2.537928      -3.774873      4.956498
## X230      3.105931       -2.218244      -3.255021      4.881604
## X231      2.948641       -2.170680      -3.908031      4.509879
## X232      3.298795       -2.676116      -4.106215      5.107058
## X233      3.520757       -2.553614      -4.988923      5.547844
## X234      3.325396       -2.390433      -3.881494      5.315680
## X235      2.766948       -2.469348      -4.565949      4.025039
## X236      3.056357       -2.400198      -4.280915      4.890111
## X238      3.066191       -2.482310      -3.463179      4.605738
## X240      3.670715       -2.321564      -3.580922      5.699444
## X241      2.747271       -2.362017      -4.383628      4.014653
## X243      2.900872       -2.344241      -2.827848      4.733688
## X244      3.168424       -2.520368      -3.624216      4.618186
## X245      3.157000       -2.275943      -3.425900      4.906389
## X247      2.858193       -2.629008      -4.154732      4.723921
## X248      2.646884       -2.434974      -2.883833      3.883102
## X249      3.227637       -2.337487      -4.570769      5.191870
## X250      2.703373       -2.289669      -4.399783      4.208655
## X251      3.159550       -2.295609      -3.242144      4.665910
## X252      2.915064       -2.370329      -4.542195      4.316482
## X253      2.986692       -2.242431      -2.984397      4.562778
## X254      2.837908       -2.294617      -4.197707      4.525113
## X255      2.961658       -2.268184      -4.017384      4.485299
## X257      3.359333       -2.379466      -3.043873      5.253674
## X258      2.848971       -2.013654      -3.081726      4.333015
## X260      3.513335       -2.241490      -3.666727      5.913428
## X261      3.298057       -2.302585      -4.149012      5.412105
## X264      2.964242       -2.545931      -4.698932      4.979289
## X265      3.094219       -2.330367      -4.417861      4.827259
## X267      2.941804       -2.334282      -3.329528      4.355967
## X268      3.083743       -2.531244      -3.727205      4.874383
## X269      2.785628       -2.361804      -3.826763      4.412381
## X270      3.015045       -2.223774      -3.039684      4.534207
## X271      2.822569       -2.744351      -5.596723      4.161235
## X273      3.044046       -2.364354      -3.003764      4.748958
## X276      2.854169       -2.099644      -4.207065      4.008967
## X277      2.650421       -2.366697      -4.710753      4.008967
## X278      2.994732       -2.416538      -4.497213      4.464360
## X279      2.881443       -2.532250      -5.244966      4.600594
## X280      2.719979       -2.352196      -4.172739      4.255969
## X281      3.280911       -2.282782      -3.709490      5.232668
## X282      2.640485       -2.549381      -4.239139      3.938613
## X283      2.900322       -2.266253      -3.817167      4.781263
## X284      2.932260       -2.238672      -3.357851      4.525113
## X286      2.912351       -2.477772      -4.827314      4.367359
## X287      3.033028       -2.452827      -2.965009      4.686585
## X288      2.574138       -2.665709      -4.308776      3.654863
## X289      2.993730       -2.523232      -2.493503      4.305672
## X290      2.938633       -2.440354      -4.272276      4.603535
## X291      2.982140       -2.435317      -2.240550      4.288943
## X292      2.949688       -2.408835      -3.856115      4.607206
## X293      2.773838       -2.297598      -3.910524      4.096440
## X294      2.859913       -2.480277      -4.199705      4.574706
## X295      2.623218       -2.336452      -4.439656      3.860909
## X296      2.585506       -2.386184      -4.809369      3.804433
## X297      2.513656       -2.462989      -4.405500      3.573135
## X298      2.898119       -2.305790      -4.600183      4.392389
## X299      2.899772       -2.721744      -4.285263      4.537986
## X301      2.939162       -2.162823      -3.441082      4.610871
## X302      2.990217       -2.470885      -3.390554      4.366547
## X303      3.172203       -2.225624      -3.050822      4.833951
## X304      2.923699       -2.236797      -4.671096      4.482982
## X305      2.899221       -2.424414      -3.629856      4.244873
## X306      3.198265       -2.593740      -3.756302      4.976136
## X307      2.761275       -2.463811      -4.845841      4.143420
## X309      2.542389       -2.606939      -5.587067      3.805473
## X310      2.627563       -2.482669      -5.357855      3.852784
## X311      2.950212       -2.428829      -4.698383      4.633474
## X312      2.753024       -2.574656      -5.112502      4.256821
## X313      2.593013       -2.431101      -3.587045      3.748604
## X314      2.372111       -2.453757      -4.312501      3.334618
## X315      2.923162       -2.231195      -4.266557      4.314822
## X316      2.824351       -2.463811      -5.805151      4.076268
## X317      2.644755       -2.559544      -5.155603      3.756060
## X318      2.937573       -2.328313      -4.098955      4.518270
## X319      2.939162       -2.305790      -2.719617      4.393191
## X320      2.833213       -2.582696      -4.529135      4.121857
## X321      2.783776       -2.243373      -3.154728      4.157683
## X322      2.978586       -2.523232      -4.285989      4.363297
## X323      2.589267       -2.176834      -3.904055      4.199074
## X324      3.068518       -2.145581      -3.716867      4.991235
## X325      2.721953       -2.444955      -4.255923      4.225110
## X326      2.850707       -2.274970      -4.654991      4.200819
## X327      2.555676       -2.374189      -4.481184      3.913012
## X329      3.030617       -2.146436      -3.871361      4.896635
## X330      3.085573       -2.149864      -3.281816      4.534207
## X331      2.741485       -2.354826      -3.428055      4.276316
## X332      2.962692       -2.345597      -3.424978      4.273783
## X333      2.988708       -2.249993      -4.506230      4.576936
## X334      2.693275       -2.488192      -4.944286      4.283059
## X335      2.945491       -2.487350      -4.775721      4.768255
## X336      3.044522       -2.190150      -4.019052      5.070864
## X337      2.655352       -2.357886      -3.770090      3.802352
## X338      3.064792       -2.395139      -3.244963      5.143922
## X339      2.863914       -2.295609      -4.234297      4.654427
## X340      3.189241       -2.235861      -3.926629      4.919334
## X341      2.805782       -2.327800      -3.489045      4.236301
## X342      2.823757       -2.467342      -3.360727      4.366547
## X343      2.705380       -2.270118      -3.975495      4.093700
## X344      3.076390       -2.323094      -3.390851      5.160983
## X346      2.688528       -2.314455      -3.063797      4.054986
## X347      2.939162       -2.478607      -4.513503      4.670202
## X348      2.690565       -2.421932      -4.195713      3.906069
## X350      2.705380       -2.155891      -3.722643      3.885109
## X351      2.837323       -2.582167      -4.963132      4.079030
## X352      2.955951       -2.085057      -2.724332      4.454212
## X353      2.859913       -2.163693      -3.159900      4.407598
## X354      3.248046       -2.278869      -3.716867      5.075113
## X355      2.644045       -2.620864      -3.385226      3.685830
## X357      2.922624       -2.223774      -3.236022      4.506820
## X358      2.785628       -2.436917      -4.835968      4.562030
## X359      2.740195       -2.489758      -3.540804      3.883102
## X360      2.907993       -2.293625      -4.712533      4.519792
## X361      2.894253       -2.598837      -5.361683      4.190330
## X362      3.071303       -2.455503      -3.889772      4.818532
## X364      2.906354       -2.334489      -4.014610      4.552291
## X366      3.080992       -2.391416      -3.960163      4.620376
## X367      3.289521       -2.312131      -3.063155      5.110649
## X369      2.847812       -2.366164      -4.499010      4.625478
## X370      3.086487       -2.241490      -3.568079      4.578422
## X371      3.148024       -2.328724      -3.239844      4.938626
## X372      2.580974       -2.530364      -4.209755      3.675924
## X373      2.714695       -2.301586      -3.621221      4.264469
## X374      2.853593       -2.359579      -3.965951      4.374653
## X375      2.776954       -2.488674      -4.143325      4.121857
## X376      2.776954       -2.314658      -4.010739      4.023154
## X377      3.006672       -2.399867      -2.571380      4.346158
## X378      3.339677       -2.588003      -4.420352      5.217230
## X379      2.718001       -2.492778      -3.511906      4.069812
## X380      2.935451       -2.107018      -3.090263      5.050733
## X381      2.561868       -2.089896      -4.030244      4.150563
## X384      2.861057       -2.261443      -3.295487      4.371414
## X385      2.618855       -2.481353      -3.876173      3.849729
## X386      3.148024       -2.443918      -4.050136      4.981809
## X389      2.740840       -2.481114      -2.707700      4.003267
## X390      3.144583       -2.292635      -3.194915      4.900541
## X391      2.503074       -2.302985      -4.294016      3.667081
## X392      2.823757       -2.264326      -3.929169      4.344519
## X393      2.994231       -2.154165      -3.530851      4.832614
## X394      3.103689       -2.148149      -3.289835      4.787400
## X395      2.874694       -2.273998      -4.115977      4.578422
## X396      2.843746       -2.520119      -4.363794      4.544020
## X397      2.938633       -2.245260      -3.933757      4.680188
## X398      2.859913       -2.520244      -3.359000      4.197328
## X399      2.696652       -2.558639      -4.220588      4.134460
## X400      2.848392       -2.398325      -3.930187      4.479114
## X402      2.389680       -2.422270      -4.419521      4.115529
## X404      2.783158       -2.314759      -4.354411      4.362484
## X405      2.704711       -2.443918      -4.768748      3.796097
## X406      2.922624       -2.298593      -3.847172      4.562030
## X407      2.698673       -2.354405      -4.385232      4.064264
## X408      3.061988       -2.583490      -3.105547      4.666626
## X409      3.028199       -2.267218      -3.585601      4.549287
## X410      2.885917       -2.443573      -3.767923      4.796917
## X411      2.866193       -2.423849      -4.610484      5.256500
## X412      2.823163       -2.228406      -4.671844      4.625478
## X413      3.076390       -2.529611      -3.621595      4.735776
## X414      3.096030       -2.463341      -3.572698      4.971715
## X415      3.394844       -2.486508      -4.249596      5.289608
## X416      3.052585       -2.325547      -3.489045      4.680900
## X418      3.048325       -2.189256      -3.247018      4.712008
## X419      2.498974       -2.432124      -4.371680      3.803393
## X421      2.946542       -2.459707      -3.888795      4.664478
## X422      2.637628       -2.272056      -2.948086      3.946432
## X423      2.773838       -2.218244      -3.909526      4.072581
## X424      2.951258       -2.401632      -3.572342      4.556042
## X425      2.950735       -2.230264      -3.971242      4.374653
## X426      3.057768       -2.511210      -5.167816      4.800985
## X427      2.706716       -2.321156      -3.252691      4.241448
## X429      2.810607       -2.507030      -5.073096      4.129067
## X430      2.871868       -2.538814      -4.444753      4.188577
## X431      3.114848       -2.307899      -2.778526      4.706382
## [output truncated: remaining per-observation rows (through X1095) omitted for readability]
## X1096     2.931194       -2.230264      -4.238446      4.530422
## X1099     2.598235       -2.207275      -4.462803      3.680332
## X1100     2.865624       -2.232127      -4.009085      4.735080
## X1101     2.996732       -2.286712      -4.058784      4.792163
## X1102     2.793004       -2.377632      -4.929793      4.120954
## X1103     3.028683       -2.390761      -3.477323      4.676626
## X1104     2.869035       -2.334385      -3.387886      4.630569
## X1107     3.196221       -2.090705      -3.353837      5.011847
## X1108     3.238286       -2.513553      -4.636454      4.931570
## X1110     2.670002       -2.304186      -3.191261      4.073504
## X1111     3.218476       -2.426223      -3.067658      4.983068
## X1112     3.235536       -2.491931      -4.446458      5.018060
## X1114     3.030134       -2.345701      -3.829522      4.499157
## X1115     3.145445       -2.380979      -3.863709      4.811124
## X1116     2.794228       -2.360850      -4.927168      4.258523
## X1117     2.808197       -2.421707      -3.478943      4.281375
## X1118     2.962175       -2.466163      -4.489167      4.562778
## X1119     3.186766       -2.502012      -3.979232      4.965386
## X1120     3.067122       -2.599510      -4.506230      4.500691
## X1121     3.110845       -2.346955      -3.489701      4.754487
## X1122     3.382015       -2.491810      -4.395720      5.238363
## X1123     3.088311       -2.381628      -3.998671      4.522074
## X1124     3.364533       -2.510471      -3.838308      5.223531
## X1125     3.318178       -2.404618      -3.598673      5.175599
## X1126     2.975019       -2.299590      -3.806762      4.351068
## X1127     3.327910       -2.510471      -4.488276      5.136237
## X1128     3.121483       -2.468286      -3.070671      4.685165
## X1130     3.301377       -2.309710      -3.620100      5.072078
## X1131     3.379974       -2.597090      -4.724179      5.365966
## X1132     3.421653       -2.255702      -3.027429      5.598355
## X1133     3.222469       -2.208184      -3.144232      4.832614
## X1134     3.108614       -2.198225      -3.543568      4.622564
## X1135     3.341093       -2.324831      -3.720164      5.363258
## X1137     3.378611       -2.138767      -2.787418      5.425895
##       smoothness_worst symmetry_worst
## X1           -1.401837     -0.9485186
## X2           -1.552206     -1.8138504
## X3           -1.468032     -1.3273311
## X4           -1.246824     -0.4547732
## X5           -1.495633     -2.1134503
## X6           -1.343543     -1.1682237
## X7           -1.468808     -1.6137366
## X9           -1.373392     -1.0226796
## X10          -1.323124     -1.0268307
## ...          (remaining rows through X898 omitted for brevity)
## X899         -1.475039     -1.8235956
## X900         -1.471528     -1.6399753
## X901         -1.530924     -1.3351867
## X902         -1.475821     -1.4857809
## X903         -1.583937     -1.7695612
## X904         -1.559143     -1.9574875
## X907         -1.447374     -1.2973504
## X908         -1.484477     -1.7177547
## X909         -1.440979     -1.9276134
## X910         -1.528409     -1.6196501
## X911         -1.554367     -1.6624877
## X912         -1.478172     -1.4810257
## X914         -1.461856     -1.8034913
## X915         -1.501270     -2.0538690
## X916         -1.530504     -1.7267800
## X918         -1.435005     -1.7456862
## X920         -1.664817     -1.8270941
## X921         -1.427968     -1.0696662
## X924         -1.722066     -1.9405520
## X926         -1.508171     -1.5845978
## X927         -1.589811     -2.1151915
## X929         -1.512259     -2.0373158
## X930         -1.699057     -2.2323896
## X931         -1.595732     -1.8947083
## X933         -1.510212     -2.0875956
## X934         -1.563074     -1.8201066
## X935         -1.514721     -1.9155516
## X937         -1.498044     -1.5256320
## X938         -1.500059     -1.9920239
## X939         -1.528409     -1.8201066
## X940         -1.479351     -0.8795614
## X942         -1.572324     -1.8277950
## X943         -1.423555     -1.8568664
## X944         -1.611911     -1.4694835
## X945         -1.553502     -1.5617875
## X946         -1.594361     -1.9245875
## X947         -1.610510     -1.8532856
## X948         -1.536401     -1.4365479
## X949         -1.221525     -1.1031070
## X951         -1.616599     -1.5344296
## X952         -1.725620     -2.2727630
## X954         -1.583937     -1.8235956
## X955         -1.520913     -2.0185279
## X957         -1.743107     -1.9668175
## X958         -1.613314     -2.3063064
## X959         -1.546616     -1.9405520
## X960         -1.508171     -1.6904389
## X961         -1.461856     -2.0447396
## X962         -1.381022     -1.5427374
## X963         -1.445865     -1.2325409
## X964         -1.472696     -1.6220237
## X965         -1.630368     -1.9817260
## X966         -1.474257     -1.8734676
## X968         -1.604934     -1.9298875
## X969         -1.549621     -1.7938986
## X970         -1.316639     -1.5109338
## X971         -1.495633     -2.0323892
## X972         -1.697160     -1.5316732
## X975         -1.500059     -2.2154336
## X977         -1.693330     -2.0096347
## X978         -1.453440     -1.6155076
## X979         -1.551343     -1.4025614
## X981         -1.497641     -1.6527015
## X982         -1.620845     -2.1030497
## X983         -1.625115     -1.5561527
## X984         -1.592083     -1.5174436
## X986         -1.436867     -1.7319669
## X988         -1.520085     -1.7966320
## X989         -1.533447     -1.6661779
## X990         -1.547473     -1.6303680
## X991         -1.520913     -1.7615522
## X992         -1.479351     -1.7884496
## X994         -1.486061     -1.5377457
## X995         -1.602161     -2.1265631
## X996         -1.514721     -1.6393726
## X997         -1.524652     -1.6729722
## X998         -1.647829     -2.0970190
## X999         -1.629410     -2.2983428
## X1000        -1.477780     -1.7358713
## X1001        -1.465711     -1.9559389
## X1002        -1.386615     -1.6321636
## X1003        -1.489239     -1.6472311
## X1004        -1.560888     -1.9801488
## X1005        -1.405058     -1.5471923
## X1006        -1.562636     -1.4747157
## X1007        -1.570996     -1.8313051
## X1008        -1.605860     -1.9896404
## X1009        -1.645868     -2.3274232
## X1011        -1.471139     -2.0000000
## X1012        -1.627498     -2.6386361
## X1013        -1.673109     -1.8497148
## X1014        -1.541491     -1.7516124
## X1016        -1.484873     -1.6686442
## X1017        -1.557403     -1.3333333
## X1018        -1.619427     -2.0234038
## X1019        -1.498044     -2.1996053
## X1020        -1.690457     -2.1657627
## X1021        -1.436122     -2.1766489
## X1022        -1.565266     -2.0430863
## X1024        -1.557838     -1.4974871
## X1026        -1.482896     -1.7241946
## X1027        -1.530504     -2.0455670
## X1028        -1.560888     -2.1648594
## X1029        -1.609578     -2.1513794
## X1031        -1.502484     -1.8917577
## X1032        -1.654239     -2.1300810
## X1033        -1.508579     -1.5300225
## X1035        -1.568346     -1.7496338
## X1036        -1.504916     -1.9505330
## X1037        -1.490035     -1.6172812
## X1038        -1.565705     -2.1693820
## X1039        -1.347217     -1.8778337
## X1040        -1.553935     -1.5499851
## X1041        -1.642449     -2.0790851
## X1042        -1.630847     -1.8575837
## X1044        -1.512668     -1.9367333
## X1048        -1.504510     -1.6879284
## X1049        -1.494430     -1.4720966
## X1050        -1.565266     -2.0773893
## X1051        -1.631327     -2.1204282
## X1053        -1.569228     -1.9856773
## X1054        -1.431294     -1.9551652
## X1055        -1.579897     -1.5185321
## X1056        -1.594818     -2.0364934
## X1057        -1.441353     -1.4996282
## X1058        -1.436867     -1.7769479
## X1059        -1.658213     -0.9244642
## X1062        -1.522157     -1.5076925
## X1063        -1.691396     -2.1885391
## X1064        -1.605860     -1.8583015
## X1065        -1.561761     -2.1091071
## X1067        -1.535978     -1.6303680
## X1069        -1.460319     -2.0160966
## X1070        -1.598022     -2.2864798
## X1073        -1.569228     -1.7087947
## X1074        -1.307322     -1.6285751
## X1075        -1.274705     -1.7476584
## X1076        -1.484477     -1.8426028
## X1078        -1.503699     -2.1702883
## X1079        -1.363438     -1.6435978
## X1080        -1.627020     -1.9193090
## X1081        -1.596189     -2.1398020
## X1082        -1.419165     -1.3402996
## X1083        -1.558708     -1.9028570
## X1084        -1.548761     -2.1867032
## X1085        -1.453060     -1.6155076
## X1086        -1.449644     -1.6072651
## X1087        -1.478565     -1.9613670
## X1088        -1.463396     -1.9359709
## X1089        -1.456108     -1.6090266
## X1091        -1.508579     -1.5595304
## X1092        -1.610510     -1.9551652
## X1093        -1.475430     -1.7470007
## X1095        -1.395077     -1.6618738
## X1096        -1.401122     -1.3719574
## X1099        -1.438733     -1.7925342
## X1100        -1.505728     -2.0177170
## X1101        -1.427968     -1.5322240
## X1102        -1.541066     -1.7756016
## X1103        -1.615659     -1.5245369
## X1104        -1.474257     -2.1802966
## X1107        -1.351243     -1.7776215
## X1108        -1.544476     -1.6166897
## X1110        -1.507356     -2.1442434
## X1111        -1.509395     -1.5427374
## X1112        -1.633249     -1.8334159
## X1114        -1.541066     -2.2173075
## X1115        -1.561761     -1.8910211
## X1116        -1.532184     -1.8626165
## X1117        -1.461471     -1.8554329
## X1118        -1.569228     -1.9590379
## X1119        -1.567024     -1.6160985
## X1120        -1.662208     -2.0340294
## X1121        -1.620372     -1.5527846
## X1122        -1.553935     -2.0765423
## X1123        -1.612846     -2.0530365
## X1124        -1.556969     -2.1065078
## X1125        -1.491630     -2.2390390
## X1126        -1.540640     -2.2051713
## X1127        -1.627020     -2.0201513
## X1128        -1.649795     -2.2088944
## X1130        -1.550912     -2.2163702
## X1131        -1.700430     -3.0539870
## X1132        -1.478565     -1.1276737
## X1133        -1.482501     -1.6954754
## X1134        -1.481318     -2.4065265
## X1135        -1.583937     -1.9436150
## X1137        -1.391894     -1.1284389
## 
## $usekernel
## [1] FALSE
## 
## $varnames
## [1] "texture_mean"     "smoothness_mean"  "compactness_se"   "texture_worst"   
## [5] "smoothness_worst" "symmetry_worst"  
## 
## $xNames
## [1] "texture_mean"     "smoothness_mean"  "compactness_se"   "texture_worst"   
## [5] "smoothness_worst" "symmetry_worst"  
## 
## $problemType
## [1] "Classification"
## 
## $tuneValue
##   fL usekernel adjust
## 1  2     FALSE  FALSE
## 
## $obsLevels
## [1] "M" "B"
## attr(,"ordered")
## [1] FALSE
## 
## $param
## list()
## 
## attr(,"class")
## [1] "NaiveBayes"
BAL_NB_Tune$results
##   usekernel fL adjust       ROC      Sens      Spec      ROCSD     SensSD
## 1     FALSE  2  FALSE 0.8864212 0.7576471 0.8643356 0.02628655 0.04767519
## 2      TRUE  2  FALSE       NaN       NaN       NaN         NA         NA
##       SpecSD
## 1 0.03462688
## 2         NA
(BAL_NB_Train_AUROC <- BAL_NB_Tune$results[BAL_NB_Tune$results$usekernel==BAL_NB_Tune$bestTune$usekernel &
                                                 BAL_NB_Tune$results$adjust==BAL_NB_Tune$bestTune$adjust &
                                                 BAL_NB_Tune$results$fL==BAL_NB_Tune$bestTune$fL,
                                                 c("ROC")])
## [1] 0.8864212
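As an aside, the explicit three-way filter above can be condensed: `merge()` joins the resampling results with the best-tune row on their shared hyperparameter columns automatically. A minimal sketch, assuming the same `BAL_NB_Tune` object:

```r
##################################
# Extracting the best-tune row by joining
# results with bestTune (sketch, equivalent
# to the explicit filter above)
##################################
# merge() matches on the shared columns (fL, usekernel, adjust)
BAL_NB_Best <- merge(BAL_NB_Tune$results, BAL_NB_Tune$bestTune)
BAL_NB_Best$ROC
```

This returns the same single ROC value as the three-condition subset, and stays correct if the tuning grid gains or loses hyperparameters.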
##################################
# Identifying and plotting the
# best model predictors
##################################
# The naive Bayes model does not support variable importance measurement

##################################
# Independently evaluating the model
# on the test set
##################################
BAL_NB_Test <- data.frame(BAL_NB_Test_Observed = MA_Test$diagnosis,
                          BAL_NB_Test_Predicted = predict(BAL_NB_Tune,
                                                          MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                          type = "prob"))

BAL_NB_Test
##      BAL_NB_Test_Observed BAL_NB_Test_Predicted.M BAL_NB_Test_Predicted.B
## 8                       M            9.812673e-01            0.0187326580
## 13                      M            7.881578e-01            0.2118422442
## 16                      M            9.997184e-01            0.0002815572
## 19                      M            8.671011e-01            0.1328989311
## 24                      M            7.560913e-01            0.2439087121
## 31                      M            9.928321e-01            0.0071679181
## 37                      M            9.719732e-01            0.0280268447
## 42                      M            9.921992e-01            0.0078007626
## 44                      M            9.691234e-01            0.0308765939
## 46                      M            9.385286e-01            0.0614713780
## 48                      M            9.924481e-01            0.0075519390
## 54                      M            6.304132e-01            0.3695868316
## 61                      B            2.168351e-02            0.9783164933
## 65                      M            9.973897e-01            0.0026103200
## 69                      B            9.547752e-01            0.0452247849
## 70                      B            1.469355e-03            0.9985306451
## 71                      M            1.671223e-01            0.8328777105
## 72                      B            2.901232e-03            0.9970987683
## 82                      B            7.848737e-01            0.2151262694
## 89                      B            6.771435e-01            0.3228564506
## 93                      B            1.362549e-04            0.9998637451
## 98                      B            3.105533e-01            0.6894466572
## 103                     B            1.169672e-02            0.9883032766
## 104                     B            7.226036e-01            0.2773964211
## 107                     B            9.242559e-01            0.0757441027
## 109                     M            9.977118e-01            0.0022882266
## 116                     B            7.041220e-01            0.2958779927
## 123                     M            9.747772e-01            0.0252228348
## 125                     B            1.610990e-03            0.9983890097
## 129                     B            1.254316e-01            0.8745684067
## 130                     M            9.654620e-01            0.0345379953
## 136                     M            6.257974e-01            0.3742025777
## 139                     M            7.321426e-01            0.2678574454
## 147                     M            9.976976e-01            0.0023024442
## 149                     B            3.341167e-02            0.9665883337
## 150                     B            2.292809e-03            0.9977071911
## 162                     M            3.374330e-03            0.9966256701
## 167                     B            1.619263e-06            0.9999983807
## 185                     M            5.289888e-01            0.4710111653
## 191                     M            9.997987e-01            0.0002013151
## 194                     M            9.979348e-01            0.0020651590
## 197                     M            9.948181e-01            0.0051819440
## 200                     M            9.940340e-01            0.0059659804
## 201                     B            5.432820e-01            0.4567179767
## 209                     B            9.864943e-01            0.0135057292
## 211                     M            5.680945e-01            0.4319054783
## 213                     M            1.159697e-01            0.8840303444
## 217                     B            8.273550e-01            0.1726450444
## 221                     B            9.337968e-04            0.9990662032
## 222                     B            1.921628e-02            0.9807837153
## 223                     B            2.752034e-01            0.7247966362
## 228                     B            7.988226e-03            0.9920117738
## 237                     M            9.746734e-01            0.0253266139
## 239                     B            5.506258e-01            0.4493741790
## 242                     B            5.225347e-05            0.9999477465
## 246                     B            7.820348e-01            0.2179652002
## 256                     M            5.729148e-01            0.4270851809
## 259                     M            9.899523e-01            0.0100476752
## 262                     M            1.828597e-01            0.8171403353
## 263                     M            5.831366e-01            0.4168633816
## 266                     M            9.423911e-01            0.0576089409
## 272                     B            7.236743e-04            0.9992763257
## 274                     B            1.865623e-02            0.9813437674
## 275                     M            4.477037e-01            0.5522963242
## 285                     B            1.283059e-03            0.9987169411
## 300                     B            1.452065e-01            0.8547934717
## 308                     B            7.261562e-06            0.9999927384
## 328                     B            2.634175e-04            0.9997365825
## 345                     B            2.222892e-02            0.9777710828
## 349                     B            1.684737e-02            0.9831526336
## 356                     B            4.814841e-02            0.9518515907
## 363                     B            2.610626e-01            0.7389373556
## 365                     B            4.974638e-03            0.9950253624
## 368                     B            2.272191e-01            0.7727808560
## 382                     B            4.210317e-03            0.9957896827
## 383                     B            1.044271e-02            0.9895572919
## 387                     B            9.190923e-04            0.9990809077
## 388                     B            6.015690e-05            0.9999398431
## 401                     M            9.937431e-01            0.0062568916
## 403                     B            1.523278e-02            0.9847672192
## 417                     B            8.998595e-01            0.1001405044
## 420                     B            3.812498e-01            0.6187501980
## 428                     B            6.881346e-01            0.3118654269
## 434                     M            9.404447e-01            0.0595552690
## 442                     M            8.579651e-01            0.1420348925
## 444                     B            4.442142e-03            0.9955578580
## 445                     M            3.409643e-02            0.9659035700
## 454                     B            3.366251e-03            0.9966337488
## 455                     B            1.320695e-02            0.9867930530
## 460                     B            1.629774e-01            0.8370226494
## 462                     M            9.555680e-01            0.0444320351
## 463                     B            2.543201e-02            0.9745679909
## 472                     B            2.461675e-01            0.7538324803
## 484                     B            1.534474e-02            0.9846552632
## 489                     B            1.607416e-01            0.8392584112
## 493                     M            7.806461e-01            0.2193538741
## 494                     B            2.870437e-06            0.9999971296
## 497                     B            7.165893e-01            0.2834107269
## 498                     B            7.976398e-02            0.9202360221
## 501                     B            1.444296e-02            0.9855570434
## 502                     M            9.988572e-01            0.0011427954
## 507                     B            6.060617e-01            0.3939382969
## 509                     B            3.659993e-03            0.9963400073
## 525                     B            8.206683e-02            0.9179331705
## 526                     B            3.149118e-02            0.9685088175
## 527                     B            8.699125e-01            0.1300875489
## 531                     B            3.977364e-01            0.6022636293
## 532                     B            8.885794e-01            0.1114206422
## 534                     M            4.729801e-01            0.5270199251
## 537                     M            9.182135e-01            0.0817865391
## 544                     B            3.039975e-01            0.6960024976
## 546                     B            4.950886e-01            0.5049114198
## 548                     B            1.351934e-01            0.8648065921
## 550                     B            4.716409e-01            0.5283590575
## 551                     B            3.146255e-03            0.9968537448
## 556                     B            8.591647e-01            0.1408352922
## 557                     B            1.816539e-01            0.8183460835
## 575                     M            9.765204e-01            0.0234796380
## 578                     M            9.993858e-01            0.0006142344
## 581                     M            9.190702e-01            0.0809297869
## 583                     M            3.392679e-01            0.6607320744
## 589                     B            2.089111e-02            0.9791088898
## 590                     B            1.214756e-01            0.8785244318
## 601                     M            9.982402e-01            0.0017598082
## 603                     M            9.887012e-01            0.0112988498
## 611                     M            9.921992e-01            0.0078007626
## 617                     M            9.924481e-01            0.0075519390
## 619                     B            2.963917e-01            0.7036083437
## 625                     B            6.499918e-02            0.9350008165
## 628                     B            1.393959e-03            0.9986060411
## 632                     M            9.742588e-01            0.0257411818
## 646                     B            2.993435e-04            0.9997006565
## 649                     B            2.036427e-01            0.7963572964
## 657                     M            9.294722e-01            0.0705278433
## 662                     B            1.362549e-04            0.9998637451
## 665                     M            9.244672e-01            0.0755327874
## 677                     B            4.899191e-02            0.9510080946
## 679                     B            7.539674e-01            0.2460325815
## 685                     B            7.041220e-01            0.2958779927
## 687                     M            9.800610e-01            0.0199390377
## 689                     M            4.952652e-01            0.5047347899
## 695                     B            5.323889e-03            0.9946761114
## 701                     M            6.978591e-01            0.3021408704
## 704                     M            8.173221e-01            0.1826778995
## 706                     B            1.619121e-01            0.8380878502
## 709                     B            6.283883e-03            0.9937161170
## 715                     B            4.234854e-02            0.9576514646
## 726                     M            7.962994e-01            0.2037006399
## 734                     M            7.155837e-01            0.2844163454
## 747                     M            8.794790e-01            0.1205210336
## 752                     M            9.340052e-01            0.0659947616
## 763                     M            9.979348e-01            0.0020651590
## 765                     B            5.775093e-03            0.9942249074
## 775                     M            1.718460e-01            0.8281540071
## 780                     M            5.680945e-01            0.4319054783
## 786                     B            8.273550e-01            0.1726450444
## 792                     B            2.752034e-01            0.7247966362
## 796                     B            6.297508e-03            0.9937024917
## 809                     M            9.731131e-01            0.0268868939
## 813                     B            5.222499e-02            0.9477750071
## 816                     B            7.849621e-03            0.9921503793
## 818                     B            9.291527e-01            0.0708472692
## 820                     M            8.356248e-01            0.1643752263
## 823                     M            3.733182e-01            0.6266817630
## 850                     M            9.957456e-01            0.0042544435
## 854                     B            1.283059e-03            0.9987169411
## 865                     B            1.545147e-04            0.9998454853
## 867                     M            1.965801e-02            0.9803419942
## 870                     M            8.832716e-01            0.1167284359
## 876                     B            7.248908e-04            0.9992751092
## 882                     B            7.471474e-04            0.9992528526
## 886                     B            3.482215e-06            0.9999965178
## 895                     B            3.659113e-02            0.9634088692
## 896                     B            4.521132e-04            0.9995478868
## 905                     M            8.848879e-01            0.1151120607
## 906                     B            1.450107e-03            0.9985498928
## 913                     M            9.819396e-01            0.0180604337
## 917                     B            3.754428e-03            0.9962455719
## 919                     B            5.127009e-02            0.9487299131
## 922                     M            9.370673e-01            0.0629327261
## 923                     M            9.748852e-01            0.0251147663
## 925                     B            4.814841e-02            0.9518515907
## 928                     B            8.699658e-04            0.9991300342
## 932                     B            2.610626e-01            0.7389373556
## 936                     M            9.726773e-01            0.0273226617
## 941                     B            5.017463e-05            0.9999498254
## 950                     B            1.730492e-01            0.8269507532
## 953                     B            5.039053e-01            0.4960947100
## 956                     B            9.190923e-04            0.9990809077
## 967                     B            3.855968e-03            0.9961440318
## 973                     B            6.279003e-02            0.9372099672
## 974                     B            7.967346e-05            0.9999203265
## 976                     B            4.407911e-03            0.9955920886
## 980                     B            3.506288e-01            0.6493711703
## 985                     B            8.606275e-01            0.1393724772
## 987                     M            9.539449e-01            0.0460550548
## 993                     B            1.773299e-01            0.8226700764
## 1010                    B            3.395580e-01            0.6604419928
## 1015                    B            8.060954e-01            0.1939045871
## 1023                    B            3.366251e-03            0.9966337488
## 1025                    B            5.667270e-01            0.4332729894
## 1030                    M            9.805921e-01            0.0194078603
## 1034                    B            9.308926e-03            0.9906910739
## 1043                    B            2.092386e-02            0.9790761384
## 1045                    B            2.280148e-02            0.9771985228
## 1046                    B            2.637147e-01            0.7362852638
## 1047                    B            1.747931e-04            0.9998252069
## 1052                    B            2.553411e-02            0.9744658931
## 1060                    B            3.113604e-01            0.6886396188
## 1061                    B            2.238129e-05            0.9999776187
## 1066                    B            7.165893e-01            0.2834107269
## 1068                    M            3.128620e-01            0.6871380447
## 1071                    M            9.988572e-01            0.0011427954
## 1072                    B            4.139866e-01            0.5860134296
## 1077                    B            5.430217e-01            0.4569782738
## 1090                    B            5.656148e-01            0.4343852029
## 1094                    B            8.206683e-02            0.9179331705
## 1097                    B            6.804212e-04            0.9993195788
## 1098                    B            5.610047e-03            0.9943899527
## 1105                    M            7.089731e-01            0.2910269275
## 1106                    M            9.182135e-01            0.0817865391
## 1109                    B            9.474609e-01            0.0525390867
## 1113                    B            3.039975e-01            0.6960024976
## 1129                    B            8.250674e-01            0.1749326039
## 1136                    M            5.683597e-01            0.4316402792
## 1138                    B            8.801571e-06            0.9999911984
##################################
# Reporting the independent evaluation results
# for the test set
##################################
BAL_NB_Test_ROC <- roc(response = BAL_NB_Test$BAL_NB_Test_Observed,
                       predictor = BAL_NB_Test$BAL_NB_Test_Predicted.M,
                       levels = rev(levels(BAL_NB_Test$BAL_NB_Test_Observed)))

(BAL_NB_Test_AUROC <- auc(BAL_NB_Test_ROC)[1])
## [1] 0.9038397
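As a hedged extension beyond the original pipeline, the pROC package can attach a DeLong confidence interval to the test AUROC, and hard class labels (and hence a confusion matrix via caret) can be derived by thresholding the predicted malignancy probability at 0.50. A sketch, assuming the `BAL_NB_Test_ROC` and `BAL_NB_Test` objects created above:

```r
library(pROC)
library(caret)

# 95% DeLong confidence interval around the test-set AUROC
ci.auc(BAL_NB_Test_ROC)

# Hard class labels at a 0.50 probability threshold,
# then a confusion matrix against the observed diagnosis
BAL_NB_Test_Class <- factor(ifelse(BAL_NB_Test$BAL_NB_Test_Predicted.M >= 0.50, "M", "B"),
                            levels = levels(BAL_NB_Test$BAL_NB_Test_Observed))
confusionMatrix(BAL_NB_Test_Class, BAL_NB_Test$BAL_NB_Test_Observed)
```

The 0.50 cutoff is an illustrative default; a threshold tuned on the training resamples would generally be preferred for an imbalanced response.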

1.7.6 Ensemble Learner Model Development using Linear Regression (ENL)


Details.

Code Chunk | Output
##################################
# Consolidating the base learners
# with optimal hyperparameters
##################################
set.seed(12345678)
BAL_LIST <- caretList(x = MA_Train[,!names(MA_Train) %in% c("diagnosis")],
                      y = MA_Train$diagnosis,
                      trControl=RKFold_Control,
                      metric="ROC",
                      tuneList=list(
                        BAL_LDA=caretModelSpec(method="lda", 
                                               preProcess=c("center","scale")),
                        BAL_CART=caretModelSpec(method="rpart", 
                                                tuneGrid=data.frame(cp=0.001)),
                        BAL_SVM_R=caretModelSpec(method="svmRadial",
                                                 preProcess=c("center","scale"),
                                                 tuneGrid=data.frame(C = 2048, sigma = 0.1790538)),
                        BAL_KNN=caretModelSpec(method="knn",
                                               preProcess=c("center","scale"),
                                               tuneGrid=data.frame(k = 1)),
                        BAL_NB=caretModelSpec(method="nb",
                                              tuneGrid=data.frame(usekernel=FALSE, fL=2, adjust=FALSE))
                        ))

BAL_LIST
## $BAL_LDA
## Linear Discriminant Analysis 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## Pre-processing: centered (6), scaled (6) 
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 730, 730, 730, 729, 729, 729, ... 
## Resampling results:
## 
##   ROC      Sens       Spec    
##   0.87416  0.6994118  0.886029
## 
## 
## $BAL_CART
## CART 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 730, 730, 730, 729, 729, 729, ... 
## Resampling results:
## 
##   ROC        Sens       Spec     
##   0.8725597  0.7741176  0.8531411
## 
## Tuning parameter 'cp' was held constant at a value of 0.001
## 
## $BAL_SVM_R
## Support Vector Machines with Radial Basis Function Kernel 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## Pre-processing: centered (6), scaled (6) 
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 730, 730, 730, 729, 729, 729, ... 
## Resampling results:
## 
##   ROC        Sens       Spec     
##   0.9081946  0.8058824  0.9272616
## 
## Tuning parameter 'sigma' was held constant at a value of 0.1790538
## 
## Tuning parameter 'C' was held constant at a value of 2048
## 
## $BAL_KNN
## k-Nearest Neighbors 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## Pre-processing: centered (6), scaled (6) 
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 730, 730, 730, 729, 729, 729, ... 
## Resampling results:
## 
##   ROC        Sens       Spec     
##   0.8981405  0.8882353  0.9080458
## 
## Tuning parameter 'k' was held constant at a value of 1
## 
## $BAL_NB
## Naive Bayes 
## 
## 912 samples
##   6 predictor
##   2 classes: 'M', 'B' 
## 
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 730, 730, 730, 729, 729, 729, ... 
## Resampling results:
## 
##   ROC        Sens       Spec     
##   0.8869568  0.7582353  0.8643143
## 
## Tuning parameter 'fL' was held constant at a value of 2
## Tuning
##  parameter 'usekernel' was held constant at a value of FALSE
## Tuning
##  parameter 'adjust' was held constant at a value of FALSE
## 
## attr(,"class")
## [1] "caretList"
##################################
# Comparing the base learners
# with optimal hyperparameters
##################################
BAL_LIST_RESAMPLES <- resamples(BAL_LIST)
summary(BAL_LIST_RESAMPLES)
## 
## Call:
## summary.resamples(object = BAL_LIST_RESAMPLES)
## 
## Models: BAL_LDA, BAL_CART, BAL_SVM_R, BAL_KNN, BAL_NB 
## Number of resamples: 25 
## 
## ROC 
##                Min.   1st Qu.    Median      Mean   3rd Qu.      Max. NA's
## BAL_LDA   0.8363171 0.8556266 0.8728070 0.8741600 0.8872549 0.9164962    0
## BAL_CART  0.7928277 0.8517157 0.8767415 0.8725597 0.8908669 0.9225703    0
## BAL_SVM_R 0.8537152 0.8934469 0.9063939 0.9081946 0.9286636 0.9615583    0
## BAL_KNN   0.8582481 0.8863171 0.8900256 0.8981405 0.9080563 0.9472394    0
## BAL_NB    0.8476522 0.8715170 0.8891899 0.8869568 0.9057018 0.9246803    0
## 
## Sens 
##                Min.   1st Qu.    Median      Mean   3rd Qu.      Max. NA's
## BAL_LDA   0.6029412 0.6617647 0.7058824 0.6994118 0.7352941 0.8235294    0
## BAL_CART  0.6764706 0.7500000 0.7794118 0.7741176 0.8088235 0.8823529    0
## BAL_SVM_R 0.7205882 0.7794118 0.8088235 0.8058824 0.8235294 0.8823529    0
## BAL_KNN   0.7941176 0.8529412 0.8823529 0.8882353 0.9264706 0.9558824    0
## BAL_NB    0.6764706 0.7205882 0.7352941 0.7582353 0.8088235 0.8529412    0
## 
## Spec 
##                Min.   1st Qu.    Median      Mean   3rd Qu.      Max. NA's
## BAL_LDA   0.8347826 0.8684211 0.8859649 0.8860290 0.9035088 0.9391304    0
## BAL_CART  0.7807018 0.8245614 0.8608696 0.8531411 0.8869565 0.9035088    0
## BAL_SVM_R 0.8508772 0.9122807 0.9298246 0.9272616 0.9473684 0.9736842    0
## BAL_KNN   0.8333333 0.8869565 0.9043478 0.9080458 0.9304348 0.9565217    0
## BAL_NB    0.7982456 0.8421053 0.8596491 0.8643143 0.8859649 0.9304348    0
dotplot(BAL_LIST_RESAMPLES)

splom(BAL_LIST_RESAMPLES)

##################################
# Measuring the correlation among
# base learners
##################################
BAL_LIST_COR <- modelCor(resamples(BAL_LIST))
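##################################
# Reviewing the pairwise correlations
# among base learner resamples
# (illustrative sketch; the 0.75
# threshold is an assumption, not
# part of the original analysis)
##################################
BAL_LIST_COR

# Flag base learner pairs with strongly
# correlated resampling profiles, since
# highly correlated learners contribute
# less diversity to a stacked ensemble
HIGH_COR <- which(abs(BAL_LIST_COR) > 0.75 & upper.tri(BAL_LIST_COR),
                  arr.ind = TRUE)
data.frame(Model1      = rownames(BAL_LIST_COR)[HIGH_COR[, "row"]],
           Model2      = colnames(BAL_LIST_COR)[HIGH_COR[, "col"]],
           Correlation = BAL_LIST_COR[HIGH_COR])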

##################################
# Formulating an ensemble model
# using the base learners
##################################
set.seed(12345678)
ENL <- caretEnsemble(BAL_LIST,
                     metric="ROC",
                     trControl=RKFold_Control)
print(ENL)
## A glm ensemble of 5 base models: BAL_LDA, BAL_CART, BAL_SVM_R, BAL_KNN, BAL_NB
## 
## Ensemble results:
## Generalized Linear Model 
## 
## 4560 samples
##    5 predictor
##    2 classes: 'M', 'B' 
## 
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 3648, 3648, 3648, 3648, 3648, 3648, ... 
## Resampling results:
## 
##   ROC        Sens       Spec    
##   0.9476578  0.8858824  0.922028
(ENL_Train_AUROC <- ENL$ens_model$results$ROC)
## [1] 0.9476578
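##################################
# Inspecting the meta-model coefficients,
# i.e. the weight the GLM stack assigns
# to each base learner (a sketch; the
# coefficient values depend on the
# fitted run and are not reproduced here)
##################################
summary(ENL$ens_model$finalModel)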
##################################
# Independently evaluating the model
# on the test set
##################################
ENL_Test <- data.frame(ENL_Test_Observed = MA_Test$diagnosis,
                       ENL_Test_Predicted = predict(ENL,
                                                    MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                    type = "prob"))

ENL_Test$ENL_Test_Predicted.M <- ENL_Test$ENL_Test_Predicted

ENL_Test
##     ENL_Test_Observed ENL_Test_Predicted ENL_Test_Predicted.M
## 1                   M         0.93698484           0.93698484
## 2                   M         0.90316238           0.90316238
## 3                   M         0.97601568           0.97601568
## 4                   M         0.91077849           0.91077849
## 5                   M         0.93437192           0.93437192
## 6                   M         0.96912348           0.96912348
## 7                   M         0.94640442           0.94640442
## 8                   M         0.86974627           0.86974627
## 9                   M         0.96390978           0.96390978
## 10                  M         0.97013507           0.97013507
## 11                  M         0.97048299           0.97048299
## 12                  M         0.80221002           0.80221002
## 13                  B         0.02124624           0.02124624
## 14                  M         0.97377786           0.97377786
## 15                  B         0.30541332           0.30541332
## 16                  B         0.02099828           0.02099828
## 17                  M         0.71594387           0.71594387
## 18                  B         0.02056671           0.02056671
## 19                  B         0.26152217           0.26152217
## 20                  B         0.12122490           0.12122490
## 21                  B         0.02050352           0.02050352
## 22                  B         0.04257131           0.04257131
## 23                  B         0.04396013           0.04396013
## 24                  B         0.14459123           0.14459123
## 25                  B         0.20488627           0.20488627
## 26                  M         0.93524820           0.93524820
## 27                  B         0.86514117           0.86514117
## 28                  M         0.93285449           0.93285449
## 29                  B         0.02085434           0.02085434
## 30                  B         0.02510866           0.02510866
## 31                  M         0.92653274           0.92653274
## 32                  M         0.91582476           0.91582476
## 33                  M         0.81037042           0.81037042
## 34                  M         0.94862798           0.94862798
## 35                  B         0.05990661           0.05990661
## 36                  B         0.02129373           0.02129373
## 37                  M         0.56743637           0.56743637
## 38                  B         0.02059139           0.02059139
## 39                  M         0.88679401           0.88679401
## 40                  M         0.95930601           0.95930601
## 41                  M         0.97588417           0.97588417
## 42                  M         0.96579927           0.96579927
## 43                  M         0.97560484           0.97560484
## 44                  B         0.13604412           0.13604412
## 45                  B         0.31905334           0.31905334
## 46                  M         0.87024068           0.87024068
## 47                  M         0.70472111           0.70472111
## 48                  B         0.84716678           0.84716678
## 49                  B         0.02517130           0.02517130
## 50                  B         0.02453744           0.02453744
## 51                  B         0.03493350           0.03493350
## 52                  B         0.04507674           0.04507674
## 53                  M         0.93284071           0.93284071
## 54                  B         0.08406829           0.08406829
## 55                  B         0.02052591           0.02052591
## 56                  B         0.15755656           0.15755656
## 57                  M         0.87275682           0.87275682
## 58                  M         0.95026912           0.95026912
## 59                  M         0.63082316           0.63082316
## 60                  M         0.86809829           0.86809829
## 61                  M         0.92185725           0.92185725
## 62                  B         0.02052229           0.02052229
## 63                  B         0.04423209           0.04423209
## 64                  M         0.90013796           0.90013796
## 65                  B         0.02649049           0.02649049
## 66                  B         0.03040485           0.03040485
## 67                  B         0.02051536           0.02051536
## 68                  B         0.02050666           0.02050666
## 69                  B         0.02126592           0.02126592
## 70                  B         0.02614123           0.02614123
## 71                  B         0.03134642           0.03134642
## 72                  B         0.11493979           0.11493979
## 73                  B         0.04325631           0.04325631
## 74                  B         0.06191254           0.06191254
## 75                  B         0.02227528           0.02227528
## 76                  B         0.04325307           0.04325307
## 77                  B         0.02322909           0.02322909
## 78                  B         0.02052967           0.02052967
## 79                  M         0.93484196           0.93484196
## 80                  B         0.04345233           0.04345233
## 81                  B         0.20634108           0.20634108
## 82                  B         0.08299901           0.08299901
## 83                  B         0.10948146           0.10948146
## 84                  M         0.94731607           0.94731607
## 85                  M         0.91087710           0.91087710
## 86                  B         0.04080377           0.04080377
## 87                  M         0.53269647           0.53269647
## 88                  B         0.02061297           0.02061297
## 89                  B         0.04383689           0.04383689
## 90                  B         0.05612948           0.05612948
## 91                  M         0.96382289           0.96382289
## 92                  B         0.02332053           0.02332053
## 93                  B         0.03470112           0.03470112
## 94                  B         0.02244913           0.02244913
## 95                  B         0.02845368           0.02845368
## 96                  M         0.89267185           0.89267185
## 97                  B         0.02049774           0.02049774
## 98                  B         0.93276791           0.93276791
## 99                  B         0.04884028           0.04884028
## 100                 B         0.02392557           0.02392557
## 101                 M         0.97541992           0.97541992
## 102                 B         0.12656999           0.12656999
## 103                 B         0.02364341           0.02364341
## 104                 B         0.60219376           0.60219376
## 105                 B         0.03038320           0.03038320
## 106                 B         0.29505551           0.29505551
## 107                 B         0.08107371           0.08107371
## 108                 B         0.37623490           0.37623490
## 109                 M         0.86105870           0.86105870
## 110                 M         0.96287979           0.96287979
## 111                 B         0.11154821           0.11154821
## 112                 B         0.16941544           0.16941544
## 113                 B         0.05579544           0.05579544
## 114                 B         0.07386931           0.07386931
## 115                 B         0.02282196           0.02282196
## 116                 B         0.17065261           0.17065261
## 117                 B         0.04008923           0.04008923
## 118                 M         0.95269855           0.95269855
## 119                 M         0.97534468           0.97534468
## 120                 M         0.91998540           0.91998540
## 121                 M         0.78809492           0.78809492
## 122                 B         0.02246830           0.02246830
## 123                 B         0.02679706           0.02679706
## 124                 M         0.97562896           0.97562896
## 125                 M         0.96156820           0.96156820
## 126                 M         0.86974627           0.86974627
## 127                 M         0.97048299           0.97048299
## 128                 B         0.06944286           0.06944286
## 129                 B         0.02713114           0.02713114
## 130                 B         0.02072300           0.02072300
## 131                 M         0.92567944           0.92567944
## 132                 B         0.06590589           0.06590589
## 133                 B         0.04786883           0.04786883
## 134                 M         0.92226876           0.92226876
## 135                 B         0.02050352           0.02050352
## 136                 M         0.92166295           0.92166295
## 137                 B         0.04646474           0.04646474
## 138                 B         0.20432795           0.20432795
## 139                 B         0.86514117           0.86514117
## 140                 M         0.94355480           0.94355480
## 141                 M         0.83853249           0.83853249
## 142                 B         0.02496440           0.02496440
## 143                 M         0.80604979           0.80604979
## 144                 M         0.91402925           0.91402925
## 145                 B         0.05577617           0.05577617
## 146                 B         0.02069193           0.02069193
## 147                 B         0.02241456           0.02241456
## 148                 M         0.87189250           0.87189250
## 149                 M         0.89732060           0.89732060
## 150                 M         0.87656130           0.87656130
## 151                 M         0.95600731           0.95600731
## 152                 M         0.97588417           0.97588417
## 153                 B         0.04331679           0.04331679
## 154                 M         0.77608179           0.77608179
## 155                 M         0.87024068           0.87024068
## 156                 B         0.84716678           0.84716678
## 157                 B         0.03493350           0.03493350
## 158                 B         0.02072247           0.02072247
## 159                 M         0.92552933           0.92552933
## 160                 B         0.02877609           0.02877609
## 161                 B         0.02304007           0.02304007
## 162                 B         0.31638845           0.31638845
## 163                 M         0.93016447           0.93016447
## 164                 M         0.55664617           0.55664617
## 165                 M         0.97513038           0.97513038
## 166                 B         0.02649049           0.02649049
## 167                 B         0.02050233           0.02050233
## 168                 M         0.56291502           0.56291502
## 169                 M         0.92497981           0.92497981
## 170                 B         0.02158747           0.02158747
## 171                 B         0.04490583           0.04490583
## 172                 B         0.02049772           0.02049772
## 173                 B         0.02259309           0.02259309
## 174                 B         0.02134849           0.02134849
## 175                 M         0.86089019           0.86089019
## 176                 B         0.04511237           0.04511237
## 177                 M         0.96967602           0.96967602
## 178                 B         0.02078233           0.02078233
## 179                 B         0.02470231           0.02470231
## 180                 M         0.92219409           0.92219409
## 181                 M         0.96567901           0.96567901
## 182                 B         0.03134642           0.03134642
## 183                 B         0.04490922           0.04490922
## 184                 B         0.11493979           0.11493979
## 185                 M         0.94184407           0.94184407
## 186                 B         0.02049983           0.02049983
## 187                 B         0.05675360           0.05675360
## 188                 B         0.09446704           0.09446704
## 189                 B         0.02322909           0.02322909
## 190                 B         0.04266479           0.04266479
## 191                 B         0.02298553           0.02298553
## 192                 B         0.02050079           0.02050079
## 193                 B         0.02120569           0.02120569
## 194                 B         0.07527884           0.07527884
## 195                 B         0.45197849           0.45197849
## 196                 M         0.93061325           0.93061325
## 197                 B         0.14664425           0.14664425
## 198                 B         0.05185987           0.05185987
## 199                 B         0.25198837           0.25198837
## 200                 B         0.02061297           0.02061297
## 201                 B         0.11863038           0.11863038
## 202                 M         0.96913088           0.96913088
## 203                 B         0.02082299           0.02082299
## 204                 B         0.02419760           0.02419760
## 205                 B         0.03477747           0.03477747
## 206                 B         0.10145534           0.10145534
## 207                 B         0.02109181           0.02109181
## 208                 B         0.02138614           0.02138614
## 209                 B         0.09539717           0.09539717
## 210                 B         0.02050890           0.02050890
## 211                 B         0.93276791           0.93276791
## 212                 M         0.65252829           0.65252829
## 213                 M         0.97541992           0.97541992
## 214                 B         0.04558308           0.04558308
## 215                 B         0.11872470           0.11872470
## 216                 B         0.18484289           0.18484289
## 217                 B         0.60219376           0.60219376
## 218                 B         0.02052025           0.02052025
## 219                 B         0.02051262           0.02051262
## 220                 M         0.87701919           0.87701919
## 221                 M         0.96287979           0.96287979
## 222                 B         0.25378926           0.25378926
## 223                 B         0.11154821           0.11154821
## 224                 B         0.17287149           0.17287149
## 225                 M         0.84597953           0.84597953
## 226                 B         0.02427533           0.02427533
##################################
# Reporting the independent evaluation results
# for the test set
##################################
ENL_Test_ROC <- roc(response = ENL_Test$ENL_Test_Observed,
                    predictor = ENL_Test$ENL_Test_Predicted.M,
                    levels = rev(levels(ENL_Test$ENL_Test_Observed)))

(ENL_Test_AUROC <- auc(ENL_Test_ROC)[1])
## [1] 0.9862508
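##################################
# Classifying the test predictions at
# an assumed 0.50 probability cutoff
# (illustrative only; the cutoff choice
# is not part of the original analysis)
##################################
ENL_Test_Class <- factor(ifelse(ENL_Test$ENL_Test_Predicted.M >= 0.50, "M", "B"),
                         levels = levels(ENL_Test$ENL_Test_Observed))
confusionMatrix(data      = ENL_Test_Class,
                reference = ENL_Test$ENL_Test_Observed,
                positive  = "M")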

1.7.7 Meta Learner Model Development using Logistic Regression (MEL_LR)


Details.

Code Chunk | Output
##################################
# Formulating a stacked model
# using the base learners
# and a linear regression meta-model
##################################
set.seed(12345678)
MEL_LR <- caretStack(BAL_LIST,
                     metric="ROC",
                     trControl=RKFold_Control,
                     method="glm")
print(MEL_LR)
## A glm ensemble of 5 base models: BAL_LDA, BAL_CART, BAL_SVM_R, BAL_KNN, BAL_NB
## 
## Ensemble results:
## Generalized Linear Model 
## 
## 4560 samples
##    5 predictor
##    2 classes: 'M', 'B' 
## 
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 3648, 3648, 3648, 3648, 3648, 3648, ... 
## Resampling results:
## 
##   ROC        Sens       Spec    
##   0.9476578  0.8858824  0.922028
(MEL_LR_Train_AUROC <- MEL_LR$ens_model$results$ROC)
## [1] 0.9476578
##################################
# Independently evaluating the model
# on the test set
##################################
MEL_LR_Test <- data.frame(MEL_LR_Test_Observed = MA_Test$diagnosis,
                          MEL_LR_Test_Predicted = predict(MEL_LR,
                                                          MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                          type = "prob"))

MEL_LR_Test$MEL_LR_Test_Predicted.M <- MEL_LR_Test$MEL_LR_Test_Predicted

MEL_LR_Test
##     MEL_LR_Test_Observed MEL_LR_Test_Predicted MEL_LR_Test_Predicted.M
## 1                      M            0.93698484              0.93698484
## 2                      M            0.90316238              0.90316238
## 3                      M            0.97601568              0.97601568
## 4                      M            0.91077849              0.91077849
## 5                      M            0.93437192              0.93437192
## 6                      M            0.96912348              0.96912348
## 7                      M            0.94640442              0.94640442
## 8                      M            0.86974627              0.86974627
## 9                      M            0.96390978              0.96390978
## 10                     M            0.97013507              0.97013507
## 11                     M            0.97048299              0.97048299
## 12                     M            0.80221002              0.80221002
## 13                     B            0.02124624              0.02124624
## 14                     M            0.97377786              0.97377786
## 15                     B            0.30541332              0.30541332
## 16                     B            0.02099828              0.02099828
## 17                     M            0.71594387              0.71594387
## 18                     B            0.02056671              0.02056671
## 19                     B            0.26152217              0.26152217
## 20                     B            0.12122490              0.12122490
## 21                     B            0.02050352              0.02050352
## 22                     B            0.04257131              0.04257131
## 23                     B            0.04396013              0.04396013
## 24                     B            0.14459123              0.14459123
## 25                     B            0.20488627              0.20488627
## 26                     M            0.93524820              0.93524820
## 27                     B            0.86514117              0.86514117
## 28                     M            0.93285449              0.93285449
## 29                     B            0.02085434              0.02085434
## 30                     B            0.02510866              0.02510866
## 31                     M            0.92653274              0.92653274
## 32                     M            0.91582476              0.91582476
## 33                     M            0.81037042              0.81037042
## 34                     M            0.94862798              0.94862798
## 35                     B            0.05990661              0.05990661
## 36                     B            0.02129373              0.02129373
## 37                     M            0.56743637              0.56743637
## 38                     B            0.02059139              0.02059139
## 39                     M            0.88679401              0.88679401
## 40                     M            0.95930601              0.95930601
## 41                     M            0.97588417              0.97588417
## 42                     M            0.96579927              0.96579927
## 43                     M            0.97560484              0.97560484
## 44                     B            0.13604412              0.13604412
## 45                     B            0.31905334              0.31905334
## 46                     M            0.87024068              0.87024068
## 47                     M            0.70472111              0.70472111
## 48                     B            0.84716678              0.84716678
## 49                     B            0.02517130              0.02517130
## 50                     B            0.02453744              0.02453744
## 51                     B            0.03493350              0.03493350
## 52                     B            0.04507674              0.04507674
## 53                     M            0.93284071              0.93284071
## 54                     B            0.08406829              0.08406829
## 55                     B            0.02052591              0.02052591
## 56                     B            0.15755656              0.15755656
## 57                     M            0.87275682              0.87275682
## 58                     M            0.95026912              0.95026912
## 59                     M            0.63082316              0.63082316
## 60                     M            0.86809829              0.86809829
## 61                     M            0.92185725              0.92185725
## 62                     B            0.02052229              0.02052229
## 63                     B            0.04423209              0.04423209
## 64                     M            0.90013796              0.90013796
## 65                     B            0.02649049              0.02649049
## 66                     B            0.03040485              0.03040485
## 67                     B            0.02051536              0.02051536
## 68                     B            0.02050666              0.02050666
## 69                     B            0.02126592              0.02126592
## 70                     B            0.02614123              0.02614123
## 71                     B            0.03134642              0.03134642
## 72                     B            0.11493979              0.11493979
## 73                     B            0.04325631              0.04325631
## 74                     B            0.06191254              0.06191254
## 75                     B            0.02227528              0.02227528
## 76                     B            0.04325307              0.04325307
## 77                     B            0.02322909              0.02322909
## 78                     B            0.02052967              0.02052967
## 79                     M            0.93484196              0.93484196
## 80                     B            0.04345233              0.04345233
## 81                     B            0.20634108              0.20634108
## 82                     B            0.08299901              0.08299901
## 83                     B            0.10948146              0.10948146
## 84                     M            0.94731607              0.94731607
## 85                     M            0.91087710              0.91087710
## 86                     B            0.04080377              0.04080377
## 87                     M            0.53269647              0.53269647
## 88                     B            0.02061297              0.02061297
## 89                     B            0.04383689              0.04383689
## 90                     B            0.05612948              0.05612948
## 91                     M            0.96382289              0.96382289
## 92                     B            0.02332053              0.02332053
## 93                     B            0.03470112              0.03470112
## 94                     B            0.02244913              0.02244913
## 95                     B            0.02845368              0.02845368
## 96                     M            0.89267185              0.89267185
## 97                     B            0.02049774              0.02049774
## 98                     B            0.93276791              0.93276791
## 99                     B            0.04884028              0.04884028
## 100                    B            0.02392557              0.02392557
## 101                    M            0.97541992              0.97541992
## 102                    B            0.12656999              0.12656999
## 103                    B            0.02364341              0.02364341
## 104                    B            0.60219376              0.60219376
## 105                    B            0.03038320              0.03038320
## 106                    B            0.29505551              0.29505551
## 107                    B            0.08107371              0.08107371
## 108                    B            0.37623490              0.37623490
## 109                    M            0.86105870              0.86105870
## 110                    M            0.96287979              0.96287979
## 111                    B            0.11154821              0.11154821
## 112                    B            0.16941544              0.16941544
## 113                    B            0.05579544              0.05579544
## 114                    B            0.07386931              0.07386931
## 115                    B            0.02282196              0.02282196
## 116                    B            0.17065261              0.17065261
## 117                    B            0.04008923              0.04008923
## 118                    M            0.95269855              0.95269855
## 119                    M            0.97534468              0.97534468
## 120                    M            0.91998540              0.91998540
## 121                    M            0.78809492              0.78809492
## 122                    B            0.02246830              0.02246830
## 123                    B            0.02679706              0.02679706
## 124                    M            0.97562896              0.97562896
## 125                    M            0.96156820              0.96156820
## 126                    M            0.86974627              0.86974627
## 127                    M            0.97048299              0.97048299
## 128                    B            0.06944286              0.06944286
## 129                    B            0.02713114              0.02713114
## 130                    B            0.02072300              0.02072300
## 131                    M            0.92567944              0.92567944
## 132                    B            0.06590589              0.06590589
## 133                    B            0.04786883              0.04786883
## 134                    M            0.92226876              0.92226876
## 135                    B            0.02050352              0.02050352
## 136                    M            0.92166295              0.92166295
## 137                    B            0.04646474              0.04646474
## 138                    B            0.20432795              0.20432795
## 139                    B            0.86514117              0.86514117
## 140                    M            0.94355480              0.94355480
## 141                    M            0.83853249              0.83853249
## 142                    B            0.02496440              0.02496440
## 143                    M            0.80604979              0.80604979
## 144                    M            0.91402925              0.91402925
## 145                    B            0.05577617              0.05577617
## 146                    B            0.02069193              0.02069193
## 147                    B            0.02241456              0.02241456
## 148                    M            0.87189250              0.87189250
## 149                    M            0.89732060              0.89732060
## 150                    M            0.87656130              0.87656130
## 151                    M            0.95600731              0.95600731
## 152                    M            0.97588417              0.97588417
## 153                    B            0.04331679              0.04331679
## 154                    M            0.77608179              0.77608179
## 155                    M            0.87024068              0.87024068
## 156                    B            0.84716678              0.84716678
## 157                    B            0.03493350              0.03493350
## 158                    B            0.02072247              0.02072247
## 159                    M            0.92552933              0.92552933
## 160                    B            0.02877609              0.02877609
## 161                    B            0.02304007              0.02304007
## 162                    B            0.31638845              0.31638845
## 163                    M            0.93016447              0.93016447
## 164                    M            0.55664617              0.55664617
## 165                    M            0.97513038              0.97513038
## 166                    B            0.02649049              0.02649049
## 167                    B            0.02050233              0.02050233
## 168                    M            0.56291502              0.56291502
## 169                    M            0.92497981              0.92497981
## 170                    B            0.02158747              0.02158747
## 171                    B            0.04490583              0.04490583
## 172                    B            0.02049772              0.02049772
## 173                    B            0.02259309              0.02259309
## 174                    B            0.02134849              0.02134849
## 175                    M            0.86089019              0.86089019
## 176                    B            0.04511237              0.04511237
## 177                    M            0.96967602              0.96967602
## 178                    B            0.02078233              0.02078233
## 179                    B            0.02470231              0.02470231
## 180                    M            0.92219409              0.92219409
## 181                    M            0.96567901              0.96567901
## 182                    B            0.03134642              0.03134642
## 183                    B            0.04490922              0.04490922
## 184                    B            0.11493979              0.11493979
## 185                    M            0.94184407              0.94184407
## 186                    B            0.02049983              0.02049983
## 187                    B            0.05675360              0.05675360
## 188                    B            0.09446704              0.09446704
## 189                    B            0.02322909              0.02322909
## 190                    B            0.04266479              0.04266479
## 191                    B            0.02298553              0.02298553
## 192                    B            0.02050079              0.02050079
## 193                    B            0.02120569              0.02120569
## 194                    B            0.07527884              0.07527884
## 195                    B            0.45197849              0.45197849
## 196                    M            0.93061325              0.93061325
## 197                    B            0.14664425              0.14664425
## 198                    B            0.05185987              0.05185987
## 199                    B            0.25198837              0.25198837
## 200                    B            0.02061297              0.02061297
## 201                    B            0.11863038              0.11863038
## 202                    M            0.96913088              0.96913088
## 203                    B            0.02082299              0.02082299
## 204                    B            0.02419760              0.02419760
## 205                    B            0.03477747              0.03477747
## 206                    B            0.10145534              0.10145534
## 207                    B            0.02109181              0.02109181
## 208                    B            0.02138614              0.02138614
## 209                    B            0.09539717              0.09539717
## 210                    B            0.02050890              0.02050890
## 211                    B            0.93276791              0.93276791
## 212                    M            0.65252829              0.65252829
## 213                    M            0.97541992              0.97541992
## 214                    B            0.04558308              0.04558308
## 215                    B            0.11872470              0.11872470
## 216                    B            0.18484289              0.18484289
## 217                    B            0.60219376              0.60219376
## 218                    B            0.02052025              0.02052025
## 219                    B            0.02051262              0.02051262
## 220                    M            0.87701919              0.87701919
## 221                    M            0.96287979              0.96287979
## 222                    B            0.25378926              0.25378926
## 223                    B            0.11154821              0.11154821
## 224                    B            0.17287149              0.17287149
## 225                    M            0.84597953              0.84597953
## 226                    B            0.02427533              0.02427533
#################################
# Reporting the independent evaluation results
# for the test set
#################################
MEL_LR_Test_ROC <- roc(response = MEL_LR_Test$MEL_LR_Test_Observed,
                       predictor = MEL_LR_Test$MEL_LR_Test_Predicted.M,
                       levels = rev(levels(MEL_LR_Test$MEL_LR_Test_Observed)))

(MEL_LR_Test_AUROC <- auc(MEL_LR_Test_ROC)[1])
## [1] 0.9862508
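As a sanity check on reported AUROC values: the AUROC equals the probability that a randomly chosen positive case receives a higher predicted probability than a randomly chosen negative case (the normalized Mann-Whitney U statistic). A minimal sketch with hypothetical predictions, independent of the objects above:

```r
##################################
# Verifying the rank-sum identity for AUROC
# using hypothetical predictions
##################################
library(pROC)

observed  <- factor(c("M","M","M","B","B","B","B"), levels = c("M","B"))
predicted <- c(0.95, 0.80, 0.25, 0.30, 0.20, 0.10, 0.05)

# AUROC via pROC
toy_roc <- roc(response = observed,
               predictor = predicted,
               levels = c("B","M"))
auc(toy_roc)

# AUROC via the rank-sum identity:
# fraction of (positive, negative) pairs ranked correctly,
# counting ties as half
pos <- predicted[observed == "M"]
neg <- predicted[observed == "B"]
mean(outer(pos, neg, ">") + 0.5 * outer(pos, neg, "=="))
# both approaches give ~0.917
```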

1.7.8 Meta Learner Model Development using Random Forest (MEL_RF)


Details.

Code Chunk | Output
##################################
# Formulating a stacked model
# using the base learners
# and a random forest meta-model
##################################
set.seed(12345678)
MEL_RF <- caretStack(BAL_LIST,
                     metric="ROC",
                     trControl=RKFold_Control,
                     method="rf")
print(MEL_RF)
## A rf ensemble of 5 base models: BAL_LDA, BAL_CART, BAL_SVM_R, BAL_KNN, BAL_NB
## 
## Ensemble results:
## Random Forest 
## 
## 4560 samples
##    5 predictor
##    2 classes: 'M', 'B' 
## 
## No pre-processing
## Resampling: Cross-Validated (5 fold, repeated 5 times) 
## Summary of sample sizes: 3648, 3648, 3648, 3648, 3648, 3648, ... 
## Resampling results across tuning parameters:
## 
##   mtry  ROC        Sens       Spec     
##   2     0.9807114  0.9345882  0.9550350
##   3     0.9807720  0.9335294  0.9555245
##   5     0.9797318  0.9323529  0.9537762
## 
## ROC was used to select the optimal model using the largest value.
## The final value used for the model was mtry = 3.
(MEL_RF_Train_AUROC <- max(MEL_RF$ens_model$results$ROC))
## [1] 0.980772
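caretStack fits the random forest meta-model on the base learners' resampled class probabilities rather than on the original predictors. The stacking idea can be sketched with simulated base-learner outputs (a minimal illustration, not the caretEnsemble internals; all names and values here are hypothetical):

```r
##################################
# Minimal sketch of stacking:
# base-learner out-of-fold probabilities become the
# predictor columns of a second-stage (meta) model
##################################
library(randomForest)

set.seed(12345678)
n <- 200
truth <- factor(sample(c("M","B"), n, replace = TRUE), levels = c("M","B"))

# simulated out-of-fold probabilities from three base learners,
# each a noisy version of the true signal
signal <- ifelse(truth == "M", 0.8, 0.2)
base_preds <- data.frame(p_lda  = pmin(pmax(signal + rnorm(n, 0, 0.15), 0), 1),
                         p_cart = pmin(pmax(signal + rnorm(n, 0, 0.20), 0), 1),
                         p_knn  = pmin(pmax(signal + rnorm(n, 0, 0.25), 0), 1))

# the meta-learner sees only the base-learner predictions
meta_rf <- randomForest(x = base_preds, y = truth, ntree = 200)
meta_rf
```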
##################################
# Independently evaluating the model
# on the test set
##################################
MEL_RF_Test <- data.frame(MEL_RF_Test_Observed = MA_Test$diagnosis,
                          MEL_RF_Test_Predicted = predict(MEL_RF,
                                                          MA_Test[,!names(MA_Test) %in% c("diagnosis")],
                                                          type = "prob"))

##################################
# predict() with type = "prob" returns a numeric vector
# of probabilities for the first class level ("M"),
# duplicated here under an explicit column name
##################################
MEL_RF_Test$MEL_RF_Test_Predicted.M <- MEL_RF_Test$MEL_RF_Test_Predicted

MEL_RF_Test
##     MEL_RF_Test_Observed MEL_RF_Test_Predicted MEL_RF_Test_Predicted.M
## 1                      M                 1.000                   1.000
## 2                      M                 0.982                   0.982
## 3                      M                 1.000                   1.000
## 4                      M                 0.950                   0.950
## 5                      M                 0.528                   0.528
## 6                      M                 1.000                   1.000
## 7                      M                 0.976                   0.976
## 8                      M                 0.884                   0.884
## 9                      M                 1.000                   1.000
## 10                     M                 0.924                   0.924
## 11                     M                 0.902                   0.902
## 12                     M                 0.950                   0.950
## 13                     B                 0.036                   0.036
## 14                     M                 1.000                   1.000
## 15                     B                 0.350                   0.350
## 16                     B                 0.016                   0.016
## 17                     M                 0.700                   0.700
## 18                     B                 0.024                   0.024
## 19                     B                 0.232                   0.232
## 20                     B                 0.274                   0.274
## 21                     B                 0.000                   0.000
## 22                     B                 0.074                   0.074
## 23                     B                 0.034                   0.034
## 24                     B                 0.298                   0.298
## 25                     B                 0.054                   0.054
## 26                     M                 1.000                   1.000
## 27                     B                 0.738                   0.738
## 28                     M                 0.992                   0.992
## 29                     B                 0.000                   0.000
## 30                     B                 0.024                   0.024
## 31                     M                 0.964                   0.964
## 32                     M                 0.842                   0.842
## 33                     M                 0.666                   0.666
## 34                     M                 0.998                   0.998
## 35                     B                 0.638                   0.638
## 36                     B                 0.012                   0.012
## 37                     M                 0.784                   0.784
## 38                     B                 0.000                   0.000
## 39                     M                 0.900                   0.900
## 40                     M                 1.000                   1.000
## 41                     M                 1.000                   1.000
## 42                     M                 1.000                   1.000
## 43                     M                 0.978                   0.978
## 44                     B                 0.248                   0.248
## 45                     B                 0.266                   0.266
## 46                     M                 0.916                   0.916
## 47                     M                 0.978                   0.978
## 48                     B                 0.592                   0.592
## 49                     B                 0.000                   0.000
## 50                     B                 0.000                   0.000
## 51                     B                 0.012                   0.012
## 52                     B                 0.000                   0.000
## 53                     M                 1.000                   1.000
## 54                     B                 0.304                   0.304
## 55                     B                 0.000                   0.000
## 56                     B                 0.260                   0.260
## 57                     M                 0.542                   0.542
## 58                     M                 1.000                   1.000
## 59                     M                 0.778                   0.778
## 60                     M                 0.930                   0.930
## 61                     M                 0.998                   0.998
## 62                     B                 0.002                   0.002
## 63                     B                 0.000                   0.000
## 64                     M                 0.806                   0.806
## 65                     B                 0.000                   0.000
## 66                     B                 0.162                   0.162
## 67                     B                 0.000                   0.000
## 68                     B                 0.000                   0.000
## 69                     B                 0.004                   0.004
## 70                     B                 0.016                   0.016
## 71                     B                 0.002                   0.002
## 72                     B                 0.142                   0.142
## 73                     B                 0.000                   0.000
## 74                     B                 0.032                   0.032
## 75                     B                 0.002                   0.002
## 76                     B                 0.002                   0.002
## 77                     B                 0.000                   0.000
## 78                     B                 0.000                   0.000
## 79                     M                 1.000                   1.000
## 80                     B                 0.104                   0.104
## 81                     B                 0.002                   0.002
## 82                     B                 0.008                   0.008
## 83                     B                 0.636                   0.636
## 84                     M                 1.000                   1.000
## 85                     M                 0.994                   0.994
## 86                     B                 0.000                   0.000
## 87                     M                 0.958                   0.958
## 88                     B                 0.002                   0.002
## 89                     B                 0.124                   0.124
## 90                     B                 0.044                   0.044
## 91                     M                 0.826                   0.826
## 92                     B                 0.022                   0.022
## 93                     B                 0.004                   0.004
## 94                     B                 0.022                   0.022
## 95                     B                 0.128                   0.128
## 96                     M                 0.982                   0.982
## 97                     B                 0.000                   0.000
## 98                     B                 0.832                   0.832
## 99                     B                 0.008                   0.008
## 100                    B                 0.118                   0.118
## 101                    M                 1.000                   1.000
## 102                    B                 0.020                   0.020
## 103                    B                 0.088                   0.088
## 104                    B                 0.788                   0.788
## 105                    B                 0.002                   0.002
## 106                    B                 0.632                   0.632
## 107                    B                 0.046                   0.046
## 108                    B                 0.738                   0.738
## 109                    M                 0.970                   0.970
## 110                    M                 0.972                   0.972
## 111                    B                 0.376                   0.376
## 112                    B                 0.036                   0.036
## 113                    B                 0.020                   0.020
## 114                    B                 0.268                   0.268
## 115                    B                 0.000                   0.000
## 116                    B                 0.530                   0.530
## 117                    B                 0.062                   0.062
## 118                    M                 0.996                   0.996
## 119                    M                 1.000                   1.000
## 120                    M                 0.914                   0.914
## 121                    M                 0.952                   0.952
## 122                    B                 0.024                   0.024
## 123                    B                 0.000                   0.000
## 124                    M                 1.000                   1.000
## 125                    M                 0.996                   0.996
## 126                    M                 0.884                   0.884
## 127                    M                 0.902                   0.902
## 128                    B                 0.098                   0.098
## 129                    B                 0.014                   0.014
## 130                    B                 0.020                   0.020
## 131                    M                 1.000                   1.000
## 132                    B                 0.008                   0.008
## 133                    B                 0.204                   0.204
## 134                    M                 1.000                   1.000
## 135                    B                 0.000                   0.000
## 136                    M                 1.000                   1.000
## 137                    B                 0.000                   0.000
## 138                    B                 0.200                   0.200
## 139                    B                 0.738                   0.738
## 140                    M                 0.998                   0.998
## 141                    M                 0.964                   0.964
## 142                    B                 0.014                   0.014
## 143                    M                 0.968                   0.968
## 144                    M                 0.974                   0.974
## 145                    B                 0.466                   0.466
## 146                    B                 0.060                   0.060
## 147                    B                 0.004                   0.004
## 148                    M                 0.946                   0.946
## 149                    M                 0.948                   0.948
## 150                    M                 0.934                   0.934
## 151                    M                 0.970                   0.970
## 152                    M                 1.000                   1.000
## 153                    B                 0.000                   0.000
## 154                    M                 0.786                   0.786
## 155                    M                 0.916                   0.916
## 156                    B                 0.592                   0.592
## 157                    B                 0.012                   0.012
## 158                    B                 0.000                   0.000
## 159                    M                 0.994                   0.994
## 160                    B                 0.086                   0.086
## 161                    B                 0.000                   0.000
## 162                    B                 0.600                   0.600
## 163                    M                 0.908                   0.908
## 164                    M                 0.090                   0.090
## 165                    M                 1.000                   1.000
## 166                    B                 0.000                   0.000
## 167                    B                 0.000                   0.000
## 168                    M                 0.666                   0.666
## 169                    M                 0.992                   0.992
## 170                    B                 0.000                   0.000
## 171                    B                 0.000                   0.000
## 172                    B                 0.000                   0.000
## 173                    B                 0.002                   0.002
## 174                    B                 0.000                   0.000
## 175                    M                 0.866                   0.866
## 176                    B                 0.000                   0.000
## 177                    M                 0.942                   0.942
## 178                    B                 0.006                   0.006
## 179                    B                 0.000                   0.000
## 180                    M                 1.000                   1.000
## 181                    M                 0.956                   0.956
## 182                    B                 0.002                   0.002
## 183                    B                 0.000                   0.000
## 184                    B                 0.142                   0.142
## 185                    M                 1.000                   1.000
## 186                    B                 0.000                   0.000
## 187                    B                 0.030                   0.030
## 188                    B                 0.094                   0.094
## 189                    B                 0.000                   0.000
## 190                    B                 0.000                   0.000
## 191                    B                 0.000                   0.000
## 192                    B                 0.000                   0.000
## 193                    B                 0.002                   0.002
## 194                    B                 0.208                   0.208
## 195                    B                 0.790                   0.790
## 196                    M                 1.000                   1.000
## 197                    B                 0.446                   0.446
## 198                    B                 0.028                   0.028
## 199                    B                 0.498                   0.498
## 200                    B                 0.002                   0.002
## 201                    B                 0.160                   0.160
## 202                    M                 0.998                   0.998
## 203                    B                 0.164                   0.164
## 204                    B                 0.006                   0.006
## 205                    B                 0.000                   0.000
## 206                    B                 0.172                   0.172
## 207                    B                 0.000                   0.000
## 208                    B                 0.016                   0.016
## 209                    B                 0.146                   0.146
## 210                    B                 0.000                   0.000
## 211                    B                 0.832                   0.832
## 212                    M                 0.448                   0.448
## 213                    M                 1.000                   1.000
## 214                    B                 0.058                   0.058
## 215                    B                 0.200                   0.200
## 216                    B                 0.216                   0.216
## 217                    B                 0.788                   0.788
## 218                    B                 0.000                   0.000
## 219                    B                 0.000                   0.000
## 220                    M                 0.954                   0.954
## 221                    M                 0.972                   0.972
## 222                    B                 0.202                   0.202
## 223                    B                 0.376                   0.376
## 224                    B                 0.418                   0.418
## 225                    M                 0.932                   0.932
## 226                    B                 0.036                   0.036
#################################
# Reporting the independent evaluation results
# for the test set
#################################
MEL_RF_Test_ROC <- roc(response = MEL_RF_Test$MEL_RF_Test_Observed,
                       predictor = MEL_RF_Test$MEL_RF_Test_Predicted.M,
                       levels = rev(levels(MEL_RF_Test$MEL_RF_Test_Observed)))

(MEL_RF_Test_AUROC <- auc(MEL_RF_Test_ROC)[1])
## [1] 0.9884306
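The MEL_LR and MEL_RF test AUROC values are close. Since both curves were computed on the same test cases, pROC's roc.test can assess whether the difference is statistically significant (a sketch using the two roc objects created in the chunks above):

```r
##################################
# Comparing the two correlated test-set ROC curves
# with DeLong's test (paired, since both models were
# evaluated on the same test observations)
##################################
roc.test(MEL_LR_Test_ROC,
         MEL_RF_Test_ROC,
         method = "delong",
         paired = TRUE)
```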

1.8 Algorithm Comparison Summary


Details.

Code Chunk | Output
##################################
# Consolidating the resampling results
# for the formulated individual models
##################################
(Consolidated_Resampling <- resamples(list(MBS_AB = MBS_AB_Tune,
                                           MBS_GBM = MBS_GBM_Tune,
                                           MBS_XGB = MBS_XGB_Tune,
                                           MBG_RF = MBG_RF_Tune,
                                           MBG_BTREE = MBG_BTREE_Tune,
                                           BAL_LDA = BAL_LDA_Tune,
                                           BAL_CART = BAL_CART_Tune,
                                           BAL_KNN = BAL_KNN_Tune,
                                           BAL_NB = BAL_NB_Tune)))
## 
## Call:
## resamples.default(x = list(MBS_AB = MBS_AB_Tune, MBS_GBM =
##  = MBG_BTREE_Tune, BAL_LDA = BAL_LDA_Tune, BAL_CART = BAL_CART_Tune, BAL_KNN
##  = BAL_KNN_Tune, BAL_NB = BAL_NB_Tune))
## 
## Models: MBS_AB, MBS_GBM, MBS_XGB, MBG_RF, MBG_BTREE, BAL_LDA, BAL_CART, BAL_KNN, BAL_NB 
## Number of resamples: 25 
## Performance metrics: ROC, Sens, Spec 
## Time estimates for: everything, final model fit
summary(Consolidated_Resampling)
## 
## Call:
## summary.resamples(object = Consolidated_Resampling)
## 
## Models: MBS_AB, MBS_GBM, MBS_XGB, MBG_RF, MBG_BTREE, BAL_LDA, BAL_CART, BAL_KNN, BAL_NB 
## Number of resamples: 25 
## 
## ROC 
##                Min.   1st Qu.    Median      Mean   3rd Qu.      Max. NA's
## MBS_AB    0.9316305 0.9567775 0.9663683 0.9647554 0.9751032 0.9858101    0
## MBS_GBM   0.9202786 0.9524297 0.9627193 0.9599595 0.9720072 0.9840041    0
## MBS_XGB   0.9176471 0.9510230 0.9667183 0.9589816 0.9720072 0.9847781    0
## MBG_RF    0.9299872 0.9576726 0.9669763 0.9609714 0.9732972 0.9778767    0
## MBG_BTREE 0.9247291 0.9515985 0.9614938 0.9579040 0.9717492 0.9786507    0
## BAL_LDA   0.8184143 0.8556502 0.8810630 0.8736974 0.8914322 0.9135550    0
## BAL_CART  0.8157250 0.8470588 0.8759030 0.8699967 0.8950128 0.9122162    0
## BAL_KNN   0.8355263 0.8916880 0.9033887 0.8999215 0.9180946 0.9486584    0
## BAL_NB    0.8240409 0.8697110 0.8884159 0.8864212 0.9076367 0.9236573    0
## 
## Sens 
##                Min.   1st Qu.    Median      Mean   3rd Qu.      Max. NA's
## MBS_AB    0.7794118 0.8676471 0.9117647 0.9011765 0.9411765 0.9558824    0
## MBS_GBM   0.7352941 0.8823529 0.8970588 0.8970588 0.9264706 0.9705882    0
## MBS_XGB   0.7500000 0.8676471 0.9117647 0.8970588 0.9264706 0.9705882    0
## MBG_RF    0.7941176 0.8529412 0.9117647 0.8970588 0.9411765 0.9705882    0
## MBG_BTREE 0.7794118 0.8676471 0.9117647 0.8976471 0.9264706 0.9558824    0
## BAL_LDA   0.6176471 0.6617647 0.6911765 0.6988235 0.7352941 0.8088235    0
## BAL_CART  0.6617647 0.7205882 0.7647059 0.7600000 0.7941176 0.8823529    0
## BAL_KNN   0.7500000 0.8529412 0.8970588 0.8841176 0.9264706 0.9558824    0
## BAL_NB    0.6617647 0.7205882 0.7500000 0.7576471 0.7941176 0.8676471    0
## 
## Spec 
##                Min.   1st Qu.    Median      Mean   3rd Qu.      Max. NA's
## MBS_AB    0.8947368 0.9210526 0.9385965 0.9398352 0.9649123 0.9739130    0
## MBS_GBM   0.8859649 0.9217391 0.9385965 0.9405370 0.9649123 0.9824561    0
## MBS_XGB   0.9035088 0.9298246 0.9391304 0.9405492 0.9561404 0.9739130    0
## MBG_RF    0.9035088 0.9304348 0.9391304 0.9436949 0.9565217 0.9826087    0
## MBG_BTREE 0.9035088 0.9385965 0.9391304 0.9457818 0.9565217 0.9826087    0
## BAL_LDA   0.8157895 0.8508772 0.8869565 0.8831976 0.9043478 0.9304348    0
## BAL_CART  0.8157895 0.8596491 0.8684211 0.8688787 0.8859649 0.9210526    0
## BAL_KNN   0.8684211 0.8947368 0.9130435 0.9157254 0.9385965 0.9565217    0
## BAL_NB    0.7894737 0.8434783 0.8684211 0.8643356 0.8869565 0.9217391    0
##################################
# Exploring the resampling results
# for the formulated individual models
##################################
bwplot(Consolidated_Resampling,
       main = "Model Resampling Performance Comparison (Range)",
       ylab = "Model",
       pch=16,
       cex=2,
       layout=c(3,1))
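Because all models were tuned on identical resampling folds, caret's diff method for resamples objects can test whether the observed ROC, sensitivity, and specificity differences between models are statistically meaningful (a sketch using the resamples object above; summary reports pairwise difference estimates and adjusted p-values):

```r
##################################
# Testing pairwise performance differences between
# the formulated individual models across the
# matched resamples
##################################
Consolidated_Differences <- diff(Consolidated_Resampling)
summary(Consolidated_Differences)
```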

##################################
# Consolidating the train and test AUROC
# for the formulated individual models
# together with the ensemble and stacked models
##################################

Model <- c('MBS_AB','MBS_GBM','MBS_XGB',
           'MBG_RF','MBG_BTREE',
           'BAL_LDA','BAL_CART','BAL_SVM_R','BAL_KNN','BAL_NB',
           'ENL','MEL_LR','MEL_RF',
           'MBS_AB','MBS_GBM','MBS_XGB',
           'MBG_RF','MBG_BTREE',
           'BAL_LDA','BAL_CART','BAL_SVM_R','BAL_KNN','BAL_NB',
           'ENL','MEL_LR','MEL_RF')

Set <- c(rep('Cross-Validation',13),rep('Test',13))

AUROC <- c(MBS_AB_Train_AUROC,MBS_GBM_Train_AUROC,MBS_XGB_Train_AUROC,
           MBG_RF_Train_AUROC,MBG_BTREE_Train_AUROC,
           BAL_LDA_Train_AUROC,BAL_CART_Train_AUROC,BAL_SVM_R_Train_AUROC,BAL_KNN_Train_AUROC,BAL_NB_Train_AUROC,
           ENL_Train_AUROC,MEL_LR_Train_AUROC,MEL_RF_Train_AUROC,
           MBS_AB_Test_AUROC,MBS_GBM_Test_AUROC,MBS_XGB_Test_AUROC,
           MBG_RF_Test_AUROC,MBG_BTREE_Test_AUROC,
           BAL_LDA_Test_AUROC,BAL_CART_Test_AUROC,BAL_SVM_R_Test_AUROC,BAL_KNN_Test_AUROC,BAL_NB_Test_AUROC,
           ENL_Test_AUROC,MEL_LR_Test_AUROC,MEL_RF_Test_AUROC)

AUROC_Summary <- as.data.frame(cbind(Model,Set,AUROC))

##################################
# cbind() coerces every column to character,
# so the AUROC column is converted back to numeric
##################################
AUROC_Summary$AUROC <- as.numeric(as.character(AUROC_Summary$AUROC))
AUROC_Summary$Set <- factor(AUROC_Summary$Set,
                            levels = c("Cross-Validation",
                                       "Test"))
AUROC_Summary$Model <- factor(AUROC_Summary$Model,
                              levels = c('MBS_AB',
                                         'MBS_GBM',
                                         'MBS_XGB',
                                         'MBG_RF',
                                         'MBG_BTREE',
                                         'BAL_LDA',
                                         'BAL_CART',
                                         'BAL_SVM_R',
                                         'BAL_KNN',
                                         'BAL_NB',
                                         'ENL',
                                         'MEL_LR',
                                         'MEL_RF'))

print(AUROC_Summary, row.names=FALSE)
##      Model              Set     AUROC
##     MBS_AB Cross-Validation 0.9647554
##    MBS_GBM Cross-Validation 0.9599595
##    MBS_XGB Cross-Validation 0.9589816
##     MBG_RF Cross-Validation 0.9609714
##  MBG_BTREE Cross-Validation 0.9579040
##    BAL_LDA Cross-Validation 0.8736974
##   BAL_CART Cross-Validation 0.8699967
##  BAL_SVM_R Cross-Validation 0.9095067
##    BAL_KNN Cross-Validation 0.8999215
##     BAL_NB Cross-Validation 0.8864212
##        ENL Cross-Validation 0.9476578
##     MEL_LR Cross-Validation 0.9476578
##     MEL_RF Cross-Validation 0.9807720
##     MBS_AB             Test 0.9936284
##    MBS_GBM             Test 0.9825620
##    MBS_XGB             Test 0.9830651
##     MBG_RF             Test 0.9935446
##  MBG_BTREE             Test 0.9928739
##    BAL_LDA             Test 0.8984742
##   BAL_CART             Test 0.8843478
##  BAL_SVM_R             Test 0.9159121
##    BAL_KNN             Test 0.9718310
##     BAL_NB             Test 0.9038397
##        ENL             Test 0.9862508
##     MEL_LR             Test 0.9862508
##     MEL_RF             Test 0.9884306
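A note on the construction above: cbind() on mixed vectors yields a character matrix, which is why AUROC must be converted back to numeric after as.data.frame(). A minimal sketch of the more direct idiom, using data.frame() so AUROC stays numeric from the start (the two rows shown are illustrative placeholders, not the fitted results above):

```r
##################################
# Sketch: building the summary with data.frame()
# avoids the character-matrix round-trip
##################################
Model <- c('MBS_AB', 'MBG_RF')
Set   <- c('Cross-Validation', 'Test')
AUROC <- c(0.9647554, 0.9935446)

AUROC_Summary_Alt <- data.frame(Model, Set, AUROC)
str(AUROC_Summary_Alt$AUROC)  # already numeric; no as.numeric(as.character()) needed
```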
(AUROC_Plot <- dotplot(Model ~ AUROC,
                           data = AUROC_Summary,
                           groups = Set,
                           main = "Classification Model Performance Comparison",
                           ylab = "Model",
                           xlab = "AUROC",
                           auto.key = list(adj = 1),
                           type=c("p", "h"),
                           origin = 0,
                           alpha = 0.45,
                           pch = 16,
                           cex = 2))
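One optional refinement, sketched under the assumption that AUROC_Summary exists as built above: re-leveling Model by test-set AUROC makes the dotplot read top-down from strongest to weakest model rather than in the hand-coded order.

```r
##################################
# Sketch: ordering models by test-set AUROC
# before plotting (assumes AUROC_Summary from above)
##################################
Test_AUROC <- with(subset(AUROC_Summary, Set == "Test"),
                   setNames(AUROC, as.character(Model)))

AUROC_Summary$Model <- factor(AUROC_Summary$Model,
                              levels = names(sort(Test_AUROC)))
```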

2. References


[Book] Applied Predictive Modeling by Max Kuhn and Kjell Johnson
[Book] An Introduction to Statistical Learning by Gareth James, Daniela Witten, Trevor Hastie and Rob Tibshirani
[Book] Multivariate Data Visualization with R by Deepayan Sarkar
[Book] Machine Learning by Samuel Jackson
[Book] Data Modeling Methods by Jacob Larget
[Book] Introduction to R and Statistics by University of Western Australia
[Book] Feature Engineering and Selection: A Practical Approach for Predictive Models by Max Kuhn and Kjell Johnson
[Book] Introduction to Research Methods by Eric van Holm
[R Package] AppliedPredictiveModeling by Max Kuhn
[R Package] caret by Max Kuhn
[R Package] rpart by Terry Therneau and Beth Atkinson
[R Package] lattice by Deepayan Sarkar
[R Package] dplyr by Hadley Wickham
[R Package] tidyr by Hadley Wickham
[R Package] moments by Lukasz Komsta and Frederick Novomestky
[R Package] skimr by Elin Waring
[R Package] RANN by Sunil Arya, David Mount, Samuel Kemp and Gregory Jefferis
[R Package] corrplot by Taiyun Wei
[R Package] tidyverse by Hadley Wickham
[R Package] lares by Bernardo Lares
[R Package] DMwR by Luis Torgo
[R Package] gridExtra by Baptiste Auguie and Anton Antonov
[R Package] rattle by Graham Williams
[R Package] RColorBrewer by Erich Neuwirth
[R Package] stats by R Core Team
[R Package] caretEnsemble by Zachary Deane-Mayer
[Article] A Brief Introduction to caretEnsemble by Zachary Deane-Mayer
[Article] How to Build an Ensemble Of Machine Learning Algorithms in R by Jason Brownlee
[Article] Ensemble Learning: Bagging, Boosting, and Stacking by Towards AI Team
[Article] Bagging, Boosting, and Stacking in Machine Learning by Emmanuella Budu
[Article] Stacking Ensemble Machine Learning With Python by Jason Brownlee
[Article] Essence of Boosting Ensembles for Machine Learning by Jason Brownlee
[Article] Ensemble Modeling with R by Deepika Singh
[Article] Creating Ensemble Models in R by Dustin Rogers
[Publication] Ensemble Selection from Libraries of Models by Rich Caruana, Alexandru Niculescu-Mizil, Geoff Crew and Alex Ksikes (Proceedings of the 21st International Conference on Machine Learning)
[Course] Applied Data Mining and Statistical Learning by Penn State Eberly College of Science